I keep a large cache in a dictionary whose values are of type `IEnumerable<KeyValuePair<DateTime, Double>>`. I periodically remove items from the dictionary and periodically add items to it. Every now and again I get a `System.OutOfMemoryException`. I wanted to know: why doesn't the garbage collector come to my rescue?

- 1,421,763
- 867
- 9,128
- 9,194

- 705
- 3
- 8
- 19
- How many items do you keep in your cache? 1000? 1000000? I can see that maybe the large object heap gets fragmented and you wouldn't be able to grow the dictionary after a point. – Gabe Mar 17 '11 at 07:26
- You do realise that items stored in the dictionary won't be garbage collected, right? At least not until they are removed from the Dictionary and any other references are released. – Josh Smeaton Mar 17 '11 at 07:28
- I would recommend running a memory profiler to see what objects are being held in memory. Or get a crash dump and examine the heap usage. – Chansik Im Mar 17 '11 at 07:58
- *Why doesn't the garbage collector come to my rescue?* The GC does not stop you from shooting yourself in the foot ;) – MattDavey Mar 17 '11 at 09:09
4 Answers
Remember that the heap can get fragmented, as @Gabe mentioned. Even though you might have free memory, there might not be a contiguous chunk large enough to allocate into when the dictionary performs its resize.
Perhaps you could use the caching block from the patterns & practices library (MSDN Library Link), which will assist you in implementing a good cache. Maybe you could choose an algorithm that doesn't dynamically allocate memory, with a fixed number of entries?
Also note that if there isn't any memory left for you to use, then it's a problem with the size of your cache, not the garbage collector.
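To illustrate the "fixed number of entries" idea, here's a minimal sketch of a bounded cache (the key type and capacity are illustrative assumptions, not from the question): the dictionary is pre-sized once and the oldest entry is evicted before a new one is inserted, so the backing storage never has to grow and re-allocate.

```csharp
using System;
using System.Collections.Generic;

// Illustrative fixed-capacity cache: evicts the oldest key before
// inserting a new one, so the backing Dictionary never needs to resize.
class BoundedCache
{
    private readonly int capacity;
    private readonly Dictionary<string, IEnumerable<KeyValuePair<DateTime, double>>> map;
    private readonly Queue<string> insertionOrder = new Queue<string>();

    public BoundedCache(int capacity)
    {
        this.capacity = capacity;
        // Pre-sizing means the internal buckets are allocated once, up front.
        map = new Dictionary<string, IEnumerable<KeyValuePair<DateTime, double>>>(capacity);
    }

    public void Add(string key, IEnumerable<KeyValuePair<DateTime, double>> value)
    {
        if (!map.ContainsKey(key))
        {
            if (map.Count >= capacity)
                map.Remove(insertionOrder.Dequeue()); // evict oldest entry first
            insertionOrder.Enqueue(key);
        }
        map[key] = value;
    }
}
```

This is FIFO eviction for brevity; a real cache would more likely use LRU or time-based expiry, but the point is the same: the dictionary's size is bounded up front.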

- 28,526
- 15
- 68
- 103
It's quite possible that the GC is coming to your rescue for a lot of the time, but that you're just going beyond its capabilities sometimes.
Just to be absolutely clear, is this a `Dictionary<DateTime, Double>` or a `Dictionary<SomeKeyType, IEnumerable<KeyValuePair<DateTime, Double>>>`? If it's the latter, then perhaps you're holding onto references somewhere else?
How large is your cache getting? Do you have monitoring to keep track of it? What makes you think it's the dictionary which is causing the problem? If you have control over how much you cache, have you tried reducing the size?

- 1,421,763
- 867
- 9,128
- 9,194
- I have a dictionary with 500 elements; each element has a key of SomeKeyType, and each IEnumerable has approximately 5,000 key-value pairs. So that's 5,000 * 500 ≈ 2,500,000 KVPs. From what I gather, a DateTime is 8 bytes and a Double is 8 bytes, so that's 16 bytes per element, or approximately 40 megabytes. I don't see why I'm having a problem with this at all. I tend to remove elements from the dictionary before adding new ones. – Anish Patel Mar 17 '11 at 09:26
- @Anish: Well, you haven't said what implementation of `IEnumerable` you're using. For example, if you're using a `LinkedList` then there'll be pretty significant overhead per element. Even then, I wouldn't expect 2.5M KVPs to be a problem. What leads you to believe it's this cache that's causing the problem? – Jon Skeet Mar 17 '11 at 10:57
Since you're asking why the GC doesn't rescue you, I'll give an answer to that.
Using a programming language/environment with a garbage collector makes life easier for you, but it doesn't make memory management a thing of the past.
If you allocate a big chunk of memory that takes your process past roughly 2 GB on a 32-bit XP machine, you've just hit one of the first .NET memory boundaries. Keeping 2 GB in memory is always a bad idea.
On a memory-constrained machine running a huge database or the like, you'll quickly hit the limits of available memory. Since the GC is not OS-aware, it might not notice a low-memory situation in time (creating huge objects like bitmaps can trigger this situation). Manually calling GC.Collect once you've set a huge object to nothing helps a lot here.
"Keeping a big dictionary in memory" is a very sparse description. Can you tell us what's in the collection and, theoretically, how big those items are?
If by big you mean something like 2,147,483,647 items, you could have hit the Int32 size limit.
To sum up:
- Don't keep unneeded items in memory; swap them out to disk.
- Do call GC.Collect once you've freed 'big' items (but not inside the loop that deletes the items; call it once after the loop).
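The second point can be sketched like this (the cache shape and cutoff are illustrative assumptions): evict the whole expired batch first, then trigger a single collection, rather than collecting once per removal.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class EvictionSketch
{
    static void Main()
    {
        // Hypothetical cache mapping keys to their last-touched time.
        var cache = new Dictionary<string, DateTime>();
        DateTime cutoff = DateTime.UtcNow.AddHours(-1);

        // Collect the keys first; removing while enumerating would throw.
        List<string> expired = cache.Where(p => p.Value < cutoff)
                                    .Select(p => p.Key)
                                    .ToList();

        foreach (string key in expired)
            cache.Remove(key);          // no GC.Collect inside this loop

        GC.Collect();                   // one full collection after the whole batch
        GC.WaitForPendingFinalizers();  // let finalizers of the freed items run
    }
}
```

Collecting inside the loop would force repeated full collections while most of the garbage isn't even unreachable yet; one collection after the batch does the same reclamation at a fraction of the cost.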

- 3,392
- 2
- 27
- 54
- Oh, and measure how much memory your application is using with performance counters; that'll tell you what's going on. – CodingBarfield Mar 17 '11 at 09:09
I'm not sure, excuse me if I'm wrong, but maybe the Dictionary's backing storage ends up on the Large Object Heap once it's larger than about 85 KB (like a byte[90000]).
As Gabe said:
> I can see that maybe the large object heap gets fragmented and you wouldn't be able to grow the dictionary after a point
When the LOH gets fragmented, it sometimes doesn't have a large enough run of contiguous addresses left to store an object. That's what causes the OutOfMemoryException; it's really more of an "out of contiguous space in the LOH" exception.
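A small sketch of the threshold being described (the exact cutoff, about 85,000 bytes, is a .NET runtime detail, and LOH objects reporting generation 2 is .NET Framework behaviour):

```csharp
using System;

class LohDemo
{
    static void Main()
    {
        byte[] small = new byte[84000]; // below ~85,000 bytes: small object heap
        byte[] large = new byte[90000]; // at/above the threshold: large object heap

        // LOH objects are treated as generation 2 and (on .NET Framework) are
        // never compacted, so churning big arrays fragments the LOH over time.
        Console.WriteLine(GC.GetGeneration(small)); // typically 0 right after allocation
        Console.WriteLine(GC.GetGeneration(large)); // 2 for LOH-allocated objects
    }
}
```

Because LOH allocations must be satisfied from a contiguous free run, a fragmented LOH can throw OutOfMemoryException even when the total free memory is plentiful, which matches the symptom in the question.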

- 22,699
- 14
- 85
- 105

- 1,826
- 1
- 24
- 29