16

We are running R on a Linux cluster. The head node has hung a few times when a user inadvertently consumed all of its memory with an R process. Is there a way to limit R's memory usage under Linux? I'd rather not resort to global ulimits, but that may be the only way forward.

seandavi
  • I had problems with this before too ([link](http://stackoverflow.com/questions/10303241/prevent-r-from-using-virtual-memory-on-unix-linux)), which might be related to your problem. The solution we ended up with was to disable memory overcommitting on the machine entirely. It is a blunt solution, but it has worked fine. – Backlin Sep 25 '12 at 12:39
  • If, by chance, you use [RStudio server](http://rstudio.org/docs/server/configuration), you can set user limits by adding a line like `rsession-memory-limit-mb=4000` to `/etc/rstudio/rserver.conf` – GSee Sep 25 '12 at 12:57
  • Is this http://unix.stackexchange.com/questions/44985/limit-memory-usage-for-a-single-linux-process useful? (It's not an R-specific approach, but if you can come up with a generic per-process solution that works on your OS, you can set up an alias for R that imposes it.) Something like https://github.com/pshved/timeout seems like it would be particularly useful. – Ben Bolker Sep 25 '12 at 13:23
  • `ulimit` works fine until you want to use all your cores. – otsaw Sep 25 '12 at 14:51

2 Answers

17

The unix package provides unix::rlimit_as(), which lets you set a memory (address-space) limit for a running R process using the same mechanism that ulimit uses in the shell. Windows and macOS are not supported.

In my .Rprofile I have

unix::rlimit_as(1e12, 1e12)

to cap the address space. Note that the limit is given in bytes, so 1e12 is roughly 1 TB; a cap of about 12 GB would be roughly 12e9.
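
A minimal sketch of the same idea with the arithmetic spelled out (assuming the unix package is available and you are on Linux; calling rlimit_as() with no arguments should simply report the current limits without changing them):

limit_bytes <- 12 * 1024^3                 # ~12 GiB, expressed in bytes
unix::rlimit_as(limit_bytes, limit_bytes)  # set the soft and hard address-space limits
unix::rlimit_as()                          # with no arguments, reports the current limits
try(x <- numeric(4e9))                     # ~32 GB of doubles; should now fail to allocate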

Before that...

I had created a small R package, ulimit, with similar functionality.

Install it from GitHub using

devtools::install_github("krlmlr/ulimit")

To limit the memory available to R to 2000 MiB, call:

ulimit::memory_limit(2000)

Now:

> rep(0L, 1e9)
Error: cannot allocate vector of size 3.7 Gb
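
A hedged sketch of how to make this automatic (assuming the ulimit package has been installed from GitHub as above; Linux only): add the call to ~/.Rprofile so every new session starts with the cap in place.

# In ~/.Rprofile: cap each new session at 2000 MiB, if the package is available.
if (requireNamespace("ulimit", quietly = TRUE)) {
  ulimit::memory_limit(2000)
}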
krlmlr
  • As you say on GitHub, that'll work on only two of the three OSs, and most newbs work on the third. May be worthwhile noting somewhere here ... – Dirk Eddelbuettel Jun 29 '14 at 12:07
  • @DirkEddelbuettel: Good point. Windows users seem to have `memory.limit()` at their disposal. My first objective was to get it up and running for my system... – krlmlr Jun 29 '14 at 14:14
  • @krlmlr Please explain how this is necessary given that https://mran.revolutionanalytics.com/ is available. Is it complementing it in some way? – SemanticBeeng Jul 28 '15 at 05:18
  • @SemanticBeeng: I don't understand the question. What is "this", how is it related to MRAN, and what do you mean by "complementing"? – krlmlr Jul 28 '15 at 06:05
  • @krlmlr Sorry, I posted a broken link. I meant to reference http://www.revolutionanalytics.com/revolution-r-open and the fact that this implementation is supposedly not limited to RAM in how it manages data. See http://www.revolutionanalytics.com/revolution-r-enterprise-scaler: "No Memory Barriers: Revolution R Enterprise ScaleR algorithms are implemented as Parallel External Memory Algorithms (PEMAs). By managing available RAM and permanent storage together, PEMAs are able to analyze data well beyond the limits of available memory." Is my point making more sense now? – SemanticBeeng Jul 28 '15 at 13:45
  • @SemanticBeeng: My package is there to stop R from eating all of my laptop's RAM; the enterprise ScaleR (which I'm not familiar with) probably has a slightly different scope. – krlmlr Jul 28 '15 at 14:05
  • Any plans to put this on CRAN? – MichaelChirico Jan 02 '17 at 16:41
  • @MichaelChirico: Not for now; I'm planning to somehow merge it with RAppArmor, which offers more of the ulimit API but has other drawbacks. – krlmlr Jan 02 '17 at 20:59
8

?"Memory-limits" suggests using ulimit or limit.

There is a command-line flag, --max-mem-size, which sets the initial limit. The user can raise it during the session with memory.limit().
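
For completeness, a sketch of the Windows-only interface this refers to (memory.limit() has since been removed from recent versions of R, so treat this as historical rather than a recommendation):

memory.limit()             # query the current limit, in MB (Windows only)
memory.limit(size = 4000)  # raise the limit to 4000 MB for this session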

James
  • Thanks, James. --max-mem-size is now gone from R and memory.limit only applies on Windows. ulimit and limit look like the only way to go. – seandavi Sep 25 '12 at 13:07