I'm tasked with improving a piece of code that generates massive reports, in any way I see fit.
There are about 10 nearly identical reports generated (one for each 'section' of the database), and the code for them looks similar to this:
GeneratePurchaseReport(Country.France, ProductType.Chair);
GC.Collect();
GeneratePurchaseReport(Country.France, ProductType.Table);
GC.Collect();
GeneratePurchaseReport(Country.Italy, ProductType.Chair);
GC.Collect();
GeneratePurchaseReport(Country.Italy, ProductType.Table);
GC.Collect();
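For context, here is a minimal sketch of the pattern as I understand it (the names and the list contents are stand-ins, not the actual report code): a large List<T> is built inside the method, becomes unreachable the moment the method returns, and is only reclaimed when a full collection runs:

```csharp
using System;
using System.Collections.Generic;

class ReportSketch
{
    // Hypothetical stand-in for GeneratePurchaseReport: fills a large
    // list, would write the report from it, then lets it go out of scope.
    static void GenerateReport(int rows)
    {
        var data = new List<long>(rows);   // the large working set lives here
        for (int i = 0; i < rows; i++)
            data.Add(i);
        // ... report would be written from 'data' here ...
    }   // 'data' is unreachable after this point, but not yet collected

    static void Main()
    {
        GenerateReport(1_000_000);
        // Measure before and after an explicit full collection.
        long before = GC.GetTotalMemory(forceFullCollection: false);
        GC.Collect();                      // full, blocking collection
        long after = GC.GetTotalMemory(forceFullCollection: true);
        Console.WriteLine(after <= before ? "reclaimed" : "grew");
    }
}
```

The ~8 MB list is dead garbage by the time `Main` measures, so the explicit `GC.Collect()` frees it, which matches what I see in the real service.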
If I remove those GC.Collect() calls, the reporting service crashes with an OutOfMemoryException.

The bulk of the memory is kept in a massive List<T> which is filled inside GeneratePurchaseReport and is no longer of use as soon as it exits - which is why a full GC collection will reclaim the memory.
My question is two-fold:

- Why doesn't the GC do this on its own? As soon as it runs out of memory during the second GeneratePurchaseReport call, it should do a full collection before crashing and burning, shouldn't it?
- Is there a memory limit which I can raise somehow? I don't mind at all if data is swapped to disk, but the .NET process is using far less memory than even the available 2.5 GB of RAM! I'd expect it to crash only once it has run out of address space, but on a 64-bit machine I doubt that happens so soon.