
I am using an HPC server to run an R analysis and am trying to load a previously saved R data file of around 9 GB.

lnames <- load("LD.sixPop.RData")

Error: cannot allocate vector of size 4.7 Gb
Execution halted
Warning message:
system call failed: Cannot allocate memory 

The node I was running the job on has 32 CPUs and 256 GB of RAM in total, so how can R run into a memory problem? I tried cleaning up with gc(), but that did not solve the issue. A related problem: every time I open R in a terminal, it works hard to restore the saved .RData even though I never call load(). Since this happens on the head node, I have to cancel it so that it does not take up too much memory.
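One thing worth checking (an assumption on my part, not something confirmed above) is whether the shell or the job scheduler caps the virtual memory available to a single process; if it does, R can fail to allocate a 4.7 Gb vector long before the node's 256 GB is exhausted. The limit can be inspected from within R:

system("ulimit -v")    # per-process virtual memory limit in kB; "unlimited" means no cap

The slow startup happens because R automatically reloads any .RData file it finds in the working directory. Starting R with the restore step disabled (or moving the .RData file aside) avoids that:

R --no-restore-data    # start R without reloading .RData from the working directory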

The R sessionInfo() output is as follows:

sessionInfo()

R version 3.1.2 (2014-10-31)
Platform: x86_64-unknown-linux-gnu (64-bit)

Any suggestions would be welcome.

  • There may be restrictions on the amount of memory a single user and/or process can use on the machine. Ask your system administrator. – Joshua Ulrich Jan 09 '16 at 21:36
  • The output of `load()` should not be assigned to a variable. You can load the previously stored data with `load("LD.sixPop.RData")`, i.e. **without** the `lnames <- ` part. – RHertel Jan 09 '16 at 21:59
  • @RHertel - `load` returns a character vector of loaded objects' names. Sometimes assigning that to an object makes sense. – jbaums Jan 09 '16 at 22:01
  • @jbaums I agree. But usually one wants to load the data, and not just the names of the variables that have been stored. I have the impression that it is a common mistake to believe that the content of a previously stored variable `lnames` could be retrieved with `lnames <- load(...)`. – RHertel Jan 09 '16 at 22:05
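As a small illustration of the point made in the comments (a throwaway object, not the question's data): load() recreates the stored objects in the current environment and returns only a character vector of their names.

x <- 1:5
save(x, file = "tmp.RData")    # store x on disk
rm(x)                          # remove it from the workspace
lnames <- load("tmp.RData")    # recreates x in the workspace
lnames                         # "x"  -- the name, not the data
x                              # 1 2 3 4 5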
