Short version
Is there a way to prevent R from ever using any virtual memory on a unix machine? Whenever it happens, it is because I screwed up, and then I want to abort the computation.
Longer version
I am working with big datasets on a powerful computer shared with several other people. Sometimes I set off commands that require more RAM than is available, which causes R to start swapping and eventually freezes the whole machine. Normally I can solve this by setting a ulimit in my ~/.bashrc:
ulimit -m 33554432 -v 33554432 # 32 GB RAM of the total 64 GB
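For reference, this is how I confirm the limit is actually in effect in the shell that launches R (values are reported in kilobytes):

$ ulimit -v
33554432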
which causes R to throw an error and abort when it tries to allocate more memory than is available. However, if I make a mistake of this sort when parallelizing (typically using the snow package), the ulimit has no effect and the machine crashes anyway. I guess that is because snow launches the workers as separate processes that are not run in bash. If I instead try to set the ulimit in my ~/.Rprofile, I just get an error:
> system("ulimit -m 33554432 -v 33554432")
ulimit: 1: too many arguments
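As far as I understand, system() hands the command to a separate shell anyway, so even if the syntax were accepted the limit would only apply to that short-lived child process, not to R itself or to the snow workers. The closest thing to a workaround I can think of is setting the limit from inside each worker process. A minimal sketch of the idea, assuming the CRAN unix package is installed on the machine (its rlimit_as() wraps setrlimit for the address space):

library(snow)

cl <- makeCluster(4, type = "SOCK")

# Cap the address space of each worker at 8 GB (values in bytes),
# i.e. 32 GB across 4 workers, so a runaway allocation fails with an
# error instead of pushing the machine into swap.
clusterEvalQ(cl, unix::rlimit_as(8 * 1024^3))

# ... run the parallel job as usual, e.g. clusterApply(cl, ...) ...

stopCluster(cl)

I have not verified that this covers every way snow can spawn workers, though, so I would still prefer a system-level solution.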
Could someone help me figure out a way to accomplish this?
Side track
Why can I not set a ulimit of 0 virtual memory in bash?
$ ulimit -m 33554432 -v 0
If I do, it quickly shuts down.