I am using an HPC server to run an R analysis, and I am trying to load a previously saved R data file of around 9 GB.
lnames <- load("LD.sixPop.RData")
Error: cannot allocate vector of size 4.7 Gb
Execution halted
Warning message:
system call failed: Cannot allocate memory
The node I was running the job on has 32 CPUs and 256 GB of RAM in total, so how can R complain about a memory issue? I tried cleaning up with gc(), but it did not solve the problem. A related issue: each time I open R in the terminal, it works hard to restore the saved .RData workspace even without an explicit load() call. Since this was happening on the head node, I had to cancel it so it would not consume too much memory there.
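One thing I have been trying to rule out (a sketch, assuming a typical Linux cluster; the flag names below are standard R/Linux ones, not anything specific to my site): the shell or job scheduler may impose a per-process virtual-memory cap well below the node's physical 256 GB, and the automatic restore at startup would come from a .RData file sitting in the working directory.

```shell
# Show the per-process virtual-memory cap the shell/scheduler imposes.
# "unlimited", or a value comfortably above 9 GB, is needed to load the file;
# a lower cap would explain "cannot allocate vector" despite 256 GB of RAM.
ulimit -v

# Commented out, as these only apply on the cluster itself:
# Start R without restoring the saved workspace (this is what makes R
# "work hard" at startup even without calling load()):
#   R --no-restore-data
# Or rename the workspace file so R never sees it at startup:
#   mv .RData .RData.bak
```

Is checking `ulimit -v` (or the memory request in the job submission script) the right direction here, or is something else going on?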
The R sessionInfo() output is as follows:
sessionInfo()
R version 3.1.2 (2014-10-31)
Platform: x86_64-unknown-linux-gnu (64-bit)
Any suggestions would be welcome.