
A very similar question was asked for the .net Windows environment:

How do you protect yourself from runaway memory consumption bringing down the PC?

This question is posed for the Unix environment using C or C++. As an example, consider the dangerous little block below:

#include <vector>
int main() {
  int n;                     // uninitialized: n holds whatever garbage was in memory
  std::vector<double> A(n);  // may request an enormous allocation
}

If you're lucky the code will throw a length_error or bad_alloc. If you're unlucky (in my case the value of n in memory happened to be 283740392) the code will quickly use all the available RAM and cause massive swapping to disk, grinding the OS to a virtual standstill faster than it can be killed. The process can always be killed eventually, of course, but it may take minutes to recover because all the other running processes have to be paged back into memory. This is not a problem that more RAM would solve: one can easily write a runaway process that swamps any available machine.

Hooked

4 Answers


As suggested in a comment, either ulimit -v <size> before you start your program, or setrlimit programmatically within it.

Mark B

On Linux there is something called the 'OOM killer'. It kills processes based on memory usage and CPU-time history.

doron
    You don't really want things to go so bad that you need the OOM killer. It's just a fallback for when the sysadmin didn't bother setting appropriate limits. – ninjalj Oct 29 '11 at 11:26

You are looking for the setrlimit call. Previously.

themel

A possibly better but more complicated answer (than ulimit) would be a custom allocator that limits the maximum size of any single allocation (100 MB or so) but does not cap the total memory the process may use.

Mooing Duck