
I have a memory issue with R. I'm running a large, complex analysis with R installed via Anaconda in my home directory on my institution's cluster. I created my own environment so that I can install packages without problems. While the analysis is running, the following error occurs:

    Error: cannot allocate vector of size 26.7 Gb
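
If I read the message correctly, R failed to allocate a single object of 26.7 Gb, i.e. roughly 3.6 billion doubles. A bare allocation of that size reproduces the error whenever the node has less free memory than that (an illustrative sketch, not code from my actual pipeline):

    ## Illustrative only: 26.7 Gb ~ 26.7 * 1024^3 bytes; at 8 bytes per
    ## double that is ~3.58e9 elements. With less free RAM than that,
    ## this bare allocation fails in exactly the same way:
    x <- numeric(round(26.7 * 1024^3 / 8))
    ## Error: cannot allocate vector of size 26.7 Gb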

I checked the memory limit by typing memory.limit(), but it reports Inf.
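
From what I can tell, memory.limit() is only meaningful on Windows; on Linux R imposes no limit of its own, so it always reports Inf. Below is a sketch of how I can check the memory actually available on the node from within R (the system() calls are illustrative additions, not from my original session):

    ## memory.limit() is Windows-only; on Linux R sets no cap, hence Inf
    memory.limit()
    ## [1] Inf

    ## Illustrative OS-level checks, run through the shell from R:
    system("free -h")      # total / used / available RAM on the node
    system("ulimit -v")    # per-process virtual memory cap ("unlimited" if none)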

This is the output of sessionInfo():

    R version 3.6.1 (2019-07-05)
    Platform: x86_64-conda_cos6-linux-gnu (64-bit)
    Running under: CentOS Linux 7 (Core)

    Matrix products: default
    BLAS/LAPACK: /home/user/miniconda3/envs/py37/lib/libopenblasp-r0.3.7.so

    locale:
     [1] LC_CTYPE=en_US.UTF-8       LC_NUMERIC=C
     [3] LC_TIME=en_US.UTF-8        LC_COLLATE=en_US.UTF-8
     [5] LC_MONETARY=en_US.UTF-8    LC_MESSAGES=en_US.UTF-8
     [7] LC_PAPER=en_US.UTF-8       LC_NAME=C
     [9] LC_ADDRESS=C               LC_TELEPHONE=C
    [11] LC_MEASUREMENT=en_US.UTF-8 LC_IDENTIFICATION=C

    attached base packages:
    [1] stats     graphics  grDevices utils     datasets  methods   base

    loaded via a namespace (and not attached):
    [1] compiler_3.6.1 tools_3.6.1

To run the tool responsible for the problem, I ssh'ed into a single node with 24 cores so the pipeline could run there. Can anyone help me solve this problem? Thank you in advance!
