
I'm trying to run an R script (in particular, I am using the "getLineages" function from the Bioconductor package Slingshot).

I'm wondering why the error "vector memory exhausted (limit reached?)" shows up when I use this function, since it doesn't seem to be the most memory-intensive function in this package for the data I am analyzing.

I understand there are other questions like this on Stack Overflow, but they all suggest switching to the 64-bit version of R, which I am already using. There seem to be no other answers to this issue so far; does anyone know what might be going on?

The data is only ~120 MB in size, which is far less than my computer's 8 GB of RAM.
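
(For reference, one quick way to check how much memory the object actually occupies in R; "sce" below is just a placeholder for the object I pass to getLineages:)

format(object.size(sce), units = "MB")  # in-memory size of the object
gc()                                    # summary of R's current memory use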


Anjan Bharadwaj
  • It sounds like this might address the problem: http://r.789695.n4.nabble.com/R-3-5-0-vector-memory-exhausted-error-on-readBin-td4750237.html – joran Jul 11 '18 at 23:28
  • Will take a look into that solution! – Anjan Bharadwaj Jul 11 '18 at 23:32
  • I reached this error on 3.5.1 when attempting to use ggplot2's "geom_raster" on approximately 664 lat/lon points. The solution above did not work for me. It does seem like a versioning issue, as mentioned in the thread, however. – Aus_10 Aug 22 '18 at 23:23
  • @Aus_10 Did you ever get that resolved? I'm running into a similar situation with geom_raster() and I realized it's due to the lat/long coordinates not being evenly spaced. It works fine when I use aes(x = col, y = row), so I'm fairly sure it's to do with some absurd geometry going on under the hood. – LightonGlass Jun 05 '19 at 21:25

4 Answers


For those using RStudio, I've found that setting Sys.setenv('R_MAX_VSIZE'=32000000000), as suggested in several other Stack Overflow posts, only works when R is run from the command line; setting that parameter from within RStudio does not prevent this error:

Error: vector memory exhausted (limit reached?)
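
For reference, a minimal sketch of that call as it would be run from an R session started in the terminal (using the Gb-suffix form that the .Renviron examples below also use):

Sys.setenv(R_MAX_VSIZE = "32Gb")   # per the above, only takes effect when R runs in the terminal
Sys.getenv("R_MAX_VSIZE")          # check what the current session sees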

After doing some more reading, I found this thread, which clarifies the problem with RStudio and identifies a solution, shown below:

Step 1: Open the terminal.

Step 2:

cd ~
touch .Renviron   # create the file if it doesn't already exist
open .Renviron    # open it in your default editor (macOS)

Step 3: Save the following as the first line of .Renviron:

R_MAX_VSIZE=100Gb 

Step 4: Close and reopen RStudio.

Note: This limit includes both physical and virtual memory, so setting R_MAX_VSIZE=16Gb on a machine with 16 GB of physical memory may not prevent this error. You may have to play with this parameter, depending on the specs of your machine.
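
After restarting RStudio, a quick way to confirm that the new limit was picked up from .Renviron:

Sys.getenv("R_MAX_VSIZE")  # should print "100Gb" if .Renviron was read at startup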

Graeme Frost
  • Only the .Renviron version worked for me and I'm on a terminal cmd line. – abcxyz Jun 14 '19 at 19:18
  • In case this is helpful to anyone encountering the same error (I know this thread is oldish!), you can watch R's memory usage in near realtime from the Activity Monitor. I found this useful in diagnosing the issue. – tgraybam Jul 09 '20 at 16:12
    Perhaps obvious, but I had to reboot R for this to work – pr94 Oct 03 '21 at 12:02

This can be done through RStudio as well.

library(usethis)
usethis::edit_r_environ()  # opens your user .Renviron in the RStudio editor

When the tab opens up in RStudio, add this as the first line: R_MAX_VSIZE=100Gb (or however much memory you wish to allocate).

Restart R (and/or restart your computer) and rerun the R command that gave you the memory error.

Purrsia

I had the same problem. Increasing R_MAX_VSIZE did not help in my case; instead, removing variables that were no longer needed solved the problem. Hope this helps those who are struggling here.

rm(large_df, large_list, large_vector, temp_variables)  # remove objects you no longer need
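
A minimal follow-up sketch (the object names above are placeholders): run the garbage collector so the freed memory is actually released, then see which remaining objects are largest:

gc()  # force a garbage collection and print a summary of memory use
# approximate size (in bytes) of each remaining object, largest first
sort(sapply(ls(), function(nm) object.size(get(nm))), decreasing = TRUE)
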
Ömer An

I had this problem when running Rcpp::sourceCpp("my_cpp_file.cpp"), resulting in

Error: vector memory exhausted (limit reached?)

Changing the Makevars file solved it for me. Currently it looks like this:

CC=gcc
CXX=g++
CXX11=g++
CXX14=g++
cxx18=g++
cxx1X=g++
LDFLAGS=-L/usr/lib
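
As a quick check that the compilers above are being picked up, one can compile a trivial function with Rcpp (a sketch; it assumes Rcpp is installed):

library(Rcpp)
# Compiles and wraps a one-line C++ function; if the toolchain set in
# Makevars is found, this should build and run without the memory error.
cppFunction("int addOne(int x) { return x + 1; }")
addOne(41)  # returns 42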