
I've been running some memory-intensive processes on EC2 servers. The code runs quite well for about 12-14 hours (it's running thousands of simulations on 12-14 large datasets), and then all of a sudden I just see the message "Killed" with no further explanation.

What makes R do that?

UPDATE: My server specs.

Maiasaura

2 Answers


It could be the operating system's out-of-memory (OOM) killer.

Are you cleaning up your workspace when you have finished with a dataset?
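If the OOM killer is the culprit, the kernel log will say so. A quick diagnostic sketch, assuming a Linux host (as on a typical EC2 Ubuntu instance); on some configurations `dmesg` requires root, and older systems log to `/var/log/kern.log` or `/var/log/messages` instead:

```shell
# Look for kernel log entries left by the OOM killer
# (no output means it has not fired since boot):
dmesg | grep -i 'out of memory\|killed process' || true

# Watch memory headroom while the job runs; if "available"
# trends toward zero over hours, the OOM killer is the likely cause:
free -m
```

A process killed this way gets SIGKILL, which it cannot catch, so the shell prints only "Killed" with no R-level traceback.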

James
  • I verified it was the OOM killer on Ubuntu that killed my R process by checking the last OOM-killed processes using `dmesg -T | egrep -i 'killed process'`, cf. https://stackoverflow.com/questions/624857/finding-which-process-was-killed-by-linux-oom-killer – Rasmus Aug 30 '22 at 07:22

From what I know, R doesn't have a "Killed" error of its own. Most likely it's your operating system imposing a process limit or some kind of quota. If you are working on a network system, maybe ask your sysadmin?
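To rule out limits and quotas, a minimal check sketch (assumes a Linux shell; the `Rscript` line is guarded in case R isn't on the PATH):

```shell
# Per-process resource limits in the current shell; look for a finite
# "max memory size", "virtual memory", or "cpu time" value:
ulimit -a

# How much memory the OS actually sees:
free -m

# A 32-bit R build caps each process at roughly 4 GB regardless of
# installed RAM; a 64-bit build reports an 8-byte pointer size:
command -v Rscript >/dev/null && \
  Rscript -e 'cat(.Machine$sizeof.pointer * 8, "bit\n")' || true
```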

Xzhsh
  • The entire server is mine. I have no limits so to speak. It's a server with 67 gigs of RAM and I batch the scripts so more than one isn't running at a time. So I am a little puzzled why it quits without further explanation. I source the script from within R. – Maiasaura Aug 06 '10 at 18:02
  • 1
    Hum... https://stat.ethz.ch/pipermail/r-help/2004-April/049212.html, afaik R doesn't have Killed as an error though... are you sure there isn't any default process limits? I'd check your settings. Also, make sure you aren't using the 32bit version of R, and check how much memory your installation of ubuntu sees with the free command – Xzhsh Aug 06 '10 at 18:16
  • No problem, I hope you can resolve your issue some time soon. – Xzhsh Aug 06 '10 at 18:30
  • 2
    I just ran an R script and it said "Killed", so yeah, it's got one...10 years after this question was posed. – Lars Ericson Sep 21 '20 at 00:53