
I have an application that works with large amounts of data, and I suspect that an OutOfMemoryException might sometimes be thrown (in half a year I haven't seen a single one, but I'd like to understand it anyway). As far as I've investigated, after this exception I can't continue executing my program.

Is there any good pattern for handling such exceptions, especially when working with IDisposable classes?

Sisyphus
VMAtm
  • Designing an application to handle an OutOfMemoryException is probably the last thing you should do. Not as in "don't do it", but there are other things you should do before that. Memory is never an infinite resource, and a well-designed application should always be able to run as long as some minimal criteria are met; if your application can use a lot of memory to speed things up, it should be designed that way, rather than being designed to "require a lot of memory". Are you sure you've done enough to prevent the OutOfMemoryException in the first place? – Lasse V. Karlsen Jul 09 '11 at 14:30
  • @Lasse V. Karlsen I've never gotten such an exception with my project - I'm just curious whether there are any hints I should know about. – VMAtm Jul 09 '11 at 14:33
  • @VMAtm, run a memory profiling tool to ensure you are not leaking memory. It's the top priority thing to do. – myermian Jul 09 '11 at 14:37
  • @myermian I don't have a problem with that - I was just curious about it. – VMAtm Jul 09 '11 at 14:41
  • @VMAtm: there are more useful things to be curious about - things that actually happen, as opposed to this, which hasn't happened yet. – John Saunders Jul 09 '11 at 15:03

3 Answers


In a genuine OOM scenario (more likely on x86 than x64) you are pretty doomed there. Almost anything would cause an allocation, so your best option is to die as quickly and elegantly as possible, doing minimum harm.
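A minimal sketch of that "die quickly and elegantly" shape, assuming the OOM is caught at one well-defined boundary (the names `OomGuard` and `ProcessChunk` are made up, and the OOM is simulated for illustration; a real program would call `Environment.FailFast` with a constant string so the handler itself allocates nothing):

```csharp
using System;

static class OomGuard
{
    // Hypothetical worker; stands in for whatever allocates heavily.
    // The OOM is thrown manually here purely for illustration.
    public static void ProcessChunk() => throw new OutOfMemoryException();

    // Returns false instead of letting the process limp on in a
    // possibly corrupt state.
    public static bool TryProcess()
    {
        try { ProcessChunk(); return true; }
        catch (OutOfMemoryException) { return false; }
    }

    public static void Main()
    {
        if (!TryProcess())
        {
            // In production: Environment.FailFast("Out of memory");
            // FailFast terminates immediately without running further
            // managed code (and thus without needing more allocations).
            Console.WriteLine("OOM detected; terminating");
        }
    }
}
```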

Since it isn't happening, don't stress overly, but avoidance is better than handling here:

  • use streaming data APIs rather than buffering everything in memory
  • re-use buffers etc
  • avoid enormous arrays/lists/etc (in truth, the most likely way to cause an OOM is to request an enormous (but single) array) - for example, a jagged array scales better than a 2D array (even on x64 there is a hard limit on the maximum size of a single array)
  • think about how you handle sparse data
  • do you read lots of strings from external sources? If so, consider a custom interner so you don't end up with 20,000 different copies of common strings (country names, for example)
  • keep an eye on what you release when
  • avoid accidentally prolonged life on objects, especially via event subscriptions (notorious for accidental extensions to lifetimes)
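The interner bullet above can be sketched in a few lines (`InternPool` is a made-up name; `string.Intern` exists in the BCL, but its pool lives for the lifetime of the process, whereas this one can be discarded when the load finishes):

```csharp
using System.Collections.Generic;

// Minimal custom interner: returns one canonical instance for each
// distinct string value, so duplicates become collectible garbage.
class InternPool
{
    private readonly Dictionary<string, string> _pool =
        new Dictionary<string, string>();

    public string Intern(string value)
    {
        if (_pool.TryGetValue(value, out var canonical))
            return canonical;   // reuse the instance we already hold
        _pool[value] = value;   // first sighting becomes canonical
        return value;
    }
}
```

Reading 20,000 rows that each contain "France" then yields 20,000 references to one string object instead of 20,000 copies.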
Marc Gravell
  • I have very little knowledge on the topic, but your first paragraph seems to contradict the answers to [Is “Out Of Memory” A Recoverable Error?](http://stackoverflow.com/questions/333736/is-out-of-memory-a-recoverable-error). If I missed something, could you clarify, please? – Alexander Malakhov Nov 25 '11 at 08:45

There is no good pattern; OOM is a nasty exception that can strike at any moment, which makes it almost an asynchronous exception. The only odds you have of handling it are when your code is structured to allocate large amounts of memory at a single point. Then you have some chance to back out and unwind the program state as though nothing happened.
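Concentrating the large allocation at one predictable point might look like this sketch (the names are illustrative stand-ins):

```csharp
using System;

class BatchLoader
{
    // Allocate the whole working set up front; if this one predictable
    // allocation fails, nothing has been mutated yet, so the caller can
    // back out cleanly instead of having OOM strike mid-computation.
    public static int[] TryAllocateWorkingSet(long elements)
    {
        try
        {
            return new int[elements];
        }
        catch (OutOfMemoryException)
        {
            return null; // program state is still intact
        }
    }
}
```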

You need to design your program so it never needs to allocate more than about half of all available virtual memory, one gigabyte on a 32-bit machine. Dragons live beyond that amount: your program can fail on an allocation of 90 MB or less even when another 500 MB of virtual memory is unused, a problem induced by address-space fragmentation. If you routinely cross this threshold, you need to switch to a 64-bit operating system.
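The fragmentation effect can be made visible with a rough diagnostic like this sketch (not production code, and the numbers it reports depend entirely on the process and platform): it probes for the largest single contiguous block that can currently be allocated, which on a fragmented 32-bit address space is far below the total free memory.

```csharp
using System;

class ContiguousProbe
{
    // Returns the largest power-of-two byte[] size (up to maxBytes)
    // that can currently be allocated as one contiguous block.
    public static long LargestAllocatable(long maxBytes)
    {
        long best = 0;
        for (long size = 1; size <= maxBytes; size *= 2)
        {
            try
            {
                var buffer = new byte[size];
                GC.KeepAlive(buffer); // keep the probe alive until measured
                best = size;
            }
            catch (OutOfMemoryException)
            {
                break; // no contiguous block of this size available
            }
        }
        return best;
    }
}
```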

Hans Passant
  • I've hit out-of-memory errors with a lot less than a gig allocated. I suspect that the COM Dictionary object may allocate small things on the Large Object Heap, resulting in massive nasty fragmentation. I am puzzled by the rationale behind Microsoft's handling of the LOH, but it is what it is. – supercat Jul 11 '11 at 15:49
  • Address space fragmentation is always around to ruin your day. You can never allocate more than ~550 MB, right after starting up the program. That goes very quickly down-hill from there. COM does not allocate from the LOH, it uses its own heap. – Hans Passant Jul 11 '11 at 16:41
  • I would have expected it to, but for some reason running a COM DLL from within my application caused memory to spiral out of control when I switched that DLL to use Dictionary rather than Collection; when the DLL is run as its own process, the Task Manager combined memory usage of DLL+application ends up being about a third of the application's usage when both run together. – supercat Jul 11 '11 at 16:58

The two answers before mine are correct and sound advice, but there's one thing they haven't mentioned: load testing. If you're concerned that under a certain load your application might run out of memory, test it against that load (and preferably against larger loads).

In my previous job I used such a tool (HP's Performance Center) and it proved invaluable not only to check for errors and limits of our system, but also in identifying bottlenecks and costly operations.
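A commercial tool does far more, but the core idea can be sketched as a loop that drives the code under test with growing loads while watching managed memory (the `workload` delegate, the load steps, and the budget are all illustrative stand-ins, not any real tool's API):

```csharp
using System;

class MemoryLoadTest
{
    // Run the workload at increasing sizes and check that managed
    // memory stays under a budget after a full collection.
    public static bool StaysUnderBudget(Action<int> workload,
                                        int maxItems,
                                        long budgetBytes)
    {
        for (int n = 1000; n <= maxItems; n *= 10)
        {
            workload(n);
            if (GC.GetTotalMemory(forceFullCollection: true) > budgetBytes)
                return false; // memory grew past the budget at this load
        }
        return true;
    }
}
```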

VMAtm
J. Ed