
One of the issues with the CLR is its extremely bad behavior when RAM runs short: when some of a managed process's memory gets paged out, the whole system can freeze completely, to the point that even the Ctrl-Alt-Del screen can't be reached. I assume the cause is the GC, which builds a graph of reachable objects and scans all of the process's memory, causing massive page-in/page-out activity.
This is a problem for my .NET program, because it can consume a lot of RAM when the input data is huge.

I'd prefer to show a "not enough memory" message to the user, rather than completely hang their system ^_^
Is there any way to achieve this?

fithu
    The root cause of your problem is probably that your computer has too little physical memory to effectively execute the desired workload. The best solution is probably to increase the physical memory or decrease the workload. – Martin Liversage Nov 10 '10 at 14:06
    Agreed. And a badly fragmented paging file probably. – Hans Passant Nov 10 '10 at 14:26
    @Martin, My computer is ok. The problem is that input data can be of any size, thus any amount of RAM can be too small. – fithu Nov 10 '10 at 14:32
  • @Hans, bad guess. GC/paging problem can kill any PC. – fithu Nov 10 '10 at 14:35
    @fithu: Then perhaps you need to rewrite your application to not read all input data into RAM. Or if you want to block the user from reading too much data give the error message before the data is read instead of trying to come up with a hack that allows you to inform the user that too much has been read. – Martin Liversage Nov 10 '10 at 14:36
  • @Martin, my own index data takes the most of RAM, not input. And I can't know how much RAM will be necessary, as amount of my index data depends on input size and structure in a very nontrivial way. – fithu Nov 10 '10 at 14:46
  • @Fithu, databases were created for a reason... – Ian Ringrose Nov 10 '10 at 17:01
  • @Ian, databases are good for scalability, but they kill performance on small datasets. Plain RAM is tens of times faster. Actually, I don't need to process large data sets. I just need to ensure it won't kill the system if someone tries to feed it large data ^_^ – fithu Nov 10 '10 at 18:15
  • @fithu, don't let them get you down. There is a blind spot here where .NET adherents don't really believe there is a good case for artificially limiting the memory used by a .NET process, as compared to the JVM's max heap size parameters. Saying you should rewrite your app to not consume so much memory is like going to a doctor with "it hurts when I do *this*" and his response being "don't do *that*". However true it might be, it is equally unhelpful. – Kelly S. French Oct 11 '11 at 15:45

4 Answers


With MemoryFailPoint, you can tell .NET that you are going to need a certain amount of memory before starting an operation. The problem, though, is that even this mechanism counts swap space as available memory.
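
A minimal sketch of how MemoryFailPoint could be used to fail fast with a friendly message (the 512 MB estimate and ProcessLargeInput are placeholders for your own estimate and workload):

```csharp
using System;
using System.Runtime;

class Program
{
    static void Main()
    {
        const int estimatedMegabytes = 512; // placeholder: rough upper bound for the work ahead

        try
        {
            // MemoryFailPoint checks up front that enough memory (including
            // swap space) is available, instead of failing mid-operation.
            using (new MemoryFailPoint(estimatedMegabytes))
            {
                ProcessLargeInput();
            }
        }
        catch (InsufficientMemoryException)
        {
            Console.Error.WriteLine("Not enough memory to process this input.");
        }
    }

    static void ProcessLargeInput()
    {
        // placeholder for the real memory-hungry work
    }
}
```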

I believe that it is very difficult to achieve what you want here. Say you used some system and performance indicators to find out how much physical memory is available, and performed some tasks based on that. If, after you've completed this check, a different process comes in and grabs physical memory, your original calculations no longer apply and some of your memory is going to be pushed to swap.

I do have a suggestion: add a configuration setting for the maximum amount of memory the application is allowed to use. With this, you can:

  1. Try to figure out how many resources your application consumes per, e.g., network connection (if your application is a network server), and throttle the number of connections based on the maximum memory consumption, or

  2. Have a second thread running that checks the total memory consumption with GC.GetTotalMemory() every 10 seconds or every minute or so, and start rejecting connections (again, if your application is a network server) once you reach that maximum (see the sketch after the next paragraph).

This should be a configuration setting rather than something derived from, e.g., the amount of physical memory available, because you do not know what other applications are running on the machine.
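
A rough sketch of option 2, using a hypothetical MemoryWatchdog class with the limit coming from your configuration:

```csharp
using System;
using System.Threading;

// Hypothetical watchdog; the byte limit would come from a configuration setting.
class MemoryWatchdog
{
    private readonly long _limitBytes;
    private volatile bool _overLimit;

    public MemoryWatchdog(long limitBytes)
    {
        _limitBytes = limitBytes;
    }

    public bool OverLimit
    {
        get { return _overLimit; }
    }

    public void Start()
    {
        var thread = new Thread(() =>
        {
            while (true)
            {
                // GC.GetTotalMemory(false) reports the managed heap size
                // without forcing a collection first.
                _overLimit = GC.GetTotalMemory(false) > _limitBytes;
                Thread.Sleep(TimeSpan.FromSeconds(10));
            }
        });
        thread.IsBackground = true; // don't keep the process alive on exit
        thread.Start();
    }
}
```

A server's accept loop could then consult OverLimit before taking each new connection.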

Pieter van Ginkel

There are system calls to get the size of a process and the amount of RAM on a machine that may help.
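
A small sketch of what querying those numbers might look like from C# on Windows (the performance counter names are Windows-specific, and this is just one way to get at them):

```csharp
using System;
using System.Diagnostics;

class MemoryInfo
{
    static void Main()
    {
        // Size of this process: the working set is the physical memory
        // currently assigned to it; private bytes is its committed memory.
        using (var self = Process.GetCurrentProcess())
        {
            Console.WriteLine("Working set: {0} MB", self.WorkingSet64 / (1024 * 1024));
            Console.WriteLine("Private bytes: {0} MB", self.PrivateMemorySize64 / (1024 * 1024));
        }

        // Physical memory still available on the machine (Windows counter).
        using (var available = new PerformanceCounter("Memory", "Available MBytes"))
        {
            Console.WriteLine("Available RAM: {0} MB", available.NextValue());
        }
    }
}
```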

There used to be a lot of research on writing GCs that could cope with paging, but I expect they will never ship, as RAM is getting so big these days. (The basic idea was not to collect any objects that are paged out, and to try to collect all objects on a page just before the OS paged it out. But you need to know all objects that may be pointed to by a paged-out object.)

You may be able to use arrays of structs and then pass around indexes so as to reduce the number of objects you have, and hence the number of pointers the GC has to follow. This is only worthwhile if you have a LOT of data of the same type that you need in RAM.
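
A tiny sketch of the idea, using a hypothetical NodePool type; the point is that the GC traces one array instead of millions of individual node objects:

```csharp
// Nodes stored in one array of structs and linked by index instead of by
// reference, so the GC sees a single object (the array) to trace.
struct Node
{
    public int Value;
    public int NextIndex; // -1 means "no next node"
}

class NodePool
{
    private readonly Node[] _nodes;
    private int _count;

    public NodePool(int capacity)
    {
        _nodes = new Node[capacity];
    }

    // Returns the index of the new node instead of an object reference.
    public int Add(int value, int nextIndex)
    {
        _nodes[_count] = new Node { Value = value, NextIndex = nextIndex };
        return _count++;
    }

    public Node Get(int index)
    {
        return _nodes[index];
    }
}
```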

Ian Ringrose
  • Well, they could at least make the CLR show an error message rather than freezing the whole system when the OS tries to page out some managed memory. – fithu Nov 10 '10 at 14:55
  • @fithu, it is not that simple; how well any GC copes with paging depends on how well the objects are clustered, e.g. whether objects that point to each other were mostly created at the same time. – Ian Ringrose Nov 10 '10 at 15:19
  • Yes, RAM gets bigger... yes, the task is not simple... but anyway, under no circumstances should any program freeze the whole system. It's obviously a bug. – fithu Nov 11 '10 at 08:02

In Windows, you can use a Job Object to put a hard limit on the amount of virtual memory that a process can allocate. The CLR will automatically garbage collect when it hits this limit, and throw an OutOfMemoryException if it cannot free enough space. You can also limit the working set of a process instead of the virtual memory. This allows the process to allocate as much as it likes but it will get swapped out instead of consuming RAM and the system won't hang. I've used Job Objects successfully for exactly this purpose (keeping the machine responsive while running large jobs).

To create job objects from .NET, you have to use PInvoke calls to the Win32 API. This article on CodeProject explains the procedure.
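
As a hedged sketch (not the article's exact code), the core P/Invoke plumbing might look like the following. Error handling is omitted, and note that on versions of Windows before Windows 8 a process that is already inside a job cannot be assigned to a new one:

```csharp
using System;
using System.Diagnostics;
using System.Runtime.InteropServices;

// Sketch of capping the current process's committed memory with a Job Object.
static class JobMemoryLimit
{
    [StructLayout(LayoutKind.Sequential)]
    struct JOBOBJECT_BASIC_LIMIT_INFORMATION
    {
        public long PerProcessUserTimeLimit;
        public long PerJobUserTimeLimit;
        public uint LimitFlags;
        public UIntPtr MinimumWorkingSetSize;
        public UIntPtr MaximumWorkingSetSize;
        public uint ActiveProcessLimit;
        public UIntPtr Affinity;
        public uint PriorityClass;
        public uint SchedulingClass;
    }

    [StructLayout(LayoutKind.Sequential)]
    struct IO_COUNTERS
    {
        public ulong ReadOperationCount, WriteOperationCount, OtherOperationCount;
        public ulong ReadTransferCount, WriteTransferCount, OtherTransferCount;
    }

    [StructLayout(LayoutKind.Sequential)]
    struct JOBOBJECT_EXTENDED_LIMIT_INFORMATION
    {
        public JOBOBJECT_BASIC_LIMIT_INFORMATION BasicLimitInformation;
        public IO_COUNTERS IoInfo;
        public UIntPtr ProcessMemoryLimit;
        public UIntPtr JobMemoryLimit;
        public UIntPtr PeakProcessMemoryUsed;
        public UIntPtr PeakJobMemoryUsed;
    }

    const uint JOB_OBJECT_LIMIT_PROCESS_MEMORY = 0x0100;
    const int JobObjectExtendedLimitInformation = 9;

    [DllImport("kernel32.dll", CharSet = CharSet.Unicode)]
    static extern IntPtr CreateJobObject(IntPtr attributes, string name);

    [DllImport("kernel32.dll")]
    static extern bool SetInformationJobObject(IntPtr job, int infoClass,
        ref JOBOBJECT_EXTENDED_LIMIT_INFORMATION info, uint size);

    [DllImport("kernel32.dll")]
    static extern bool AssignProcessToJobObject(IntPtr job, IntPtr process);

    // Cap this process's committed memory; allocations beyond the cap fail,
    // which surfaces in managed code as an OutOfMemoryException.
    public static void Apply(ulong limitBytes)
    {
        IntPtr job = CreateJobObject(IntPtr.Zero, null);

        var info = new JOBOBJECT_EXTENDED_LIMIT_INFORMATION();
        info.BasicLimitInformation.LimitFlags = JOB_OBJECT_LIMIT_PROCESS_MEMORY;
        info.ProcessMemoryLimit = (UIntPtr)limitBytes;

        SetInformationJobObject(job, JobObjectExtendedLimitInformation,
            ref info, (uint)Marshal.SizeOf(info));
        AssignProcessToJobObject(job, Process.GetCurrentProcess().Handle);
    }
}
```

Calling Apply(1024UL * 1024 * 1024), for example, would cap the process at 1 GB; allocations beyond that fail, and the resulting OutOfMemoryException can be caught and reported to the user.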

Tom Minka

Have IIS host the process and use the memory controls in IIS to limit the amount of memory used. Hopefully this will force GC more often and minimize the need for a long-running GC session. I would imagine that this behavior is due to how your app uses memory from a generational perspective, e.g. a large number of small objects that are (1) long-lived and (2) getting moved around in memory.

Other links to similar .NET/CLR memory control issues and solutions:

  • Configure .NET CLR RAM usage
  • Restricting .Net CLR memory usage
  • Virtual and Physical Memory / OutOfMemoryException
  • Preventing OutOfMemoryException with GC.AddMemoryPressure()?
  • Force garbage collection of arrays, C#
  • examples of garbage collection bottlenecks
  • Suppressing C# garbage collection
  • how to profile .net garbage collector?

In case you are using the Large Object Heap (LOH):

  • .NET Collections and the Large Object Heap (LOH)
  • Large Object Heap Fragmentation

Kelly S. French