
I've been working for a while with a number of large files containing gene expression data, and I've recently run into an issue with loading that data into R after upgrading to R 3.5.0. After using about 8GB of memory (my Mac has 16GB of RAM), if I try to read in another file, I get the following error:

Error: vector memory exhausted (limit reached?)

I found a previous post (Error: vector memory exhausted (limit reached?)) suggesting I try setting the environment variable R_MAX_VSIZE to a higher value, so I tried the following:

Sys.setenv(R_MAX_VSIZE = 16e9)

However, I still got the same error. Am I not setting the environment variable correctly? Is there something I'm missing?

Session info:

R version 3.5.0 (2018-04-23)
Platform: x86_64-apple-darwin15.6.0 (64-bit)
Running under: macOS High Sierra 10.13.5

Matrix products: default
BLAS: /System/Library/Frameworks/Accelerate.framework/Versions/A/Frameworks/vecLib.framework/Versions/A/libBLAS.dylib
LAPACK: /Library/Frameworks/R.framework/Versions/3.5/Resources/lib/libRlapack.dylib

locale:
[1] en_US.UTF-8/en_US.UTF-8/en_US.UTF-8/C/en_US.UTF-8/en_US.UTF-8

attached base packages:
[1] stats     graphics  grDevices utils     datasets  methods   base     

other attached packages:
[1] data.table_1.11.4

loaded via a namespace (and not attached):
[1] compiler_3.5.0 tools_3.5.0   

– Graeme Frost
  • I get the "Error: vector memory exhausted (limit reached?)" at seemingly random times in R 3.5.0. I check the memory used up and R has around 2GB on a 16GB Mac laptop; it's clearly not out of memory. `Sys.setenv('R_MAX_VSIZE'=32000000000)` sets the variable correctly, but the error keeps reappearing regardless. – Gordon McDonald Aug 22 '18 at 03:06
  • I have the same problem with R 3.5.1. My Mac has 8GB of memory, and 80% was used when the error appeared, so it clearly is not because I'm running out of memory. `Sys.setenv('R_MAX_VSIZE'=32000000000)` does not change the `limit` term when I call `gc()`. I have access to a server that doesn't have a limit on vector memory, so I'll have to use it. – Lambda Moses Aug 23 '18 at 21:06

3 Answers


For those using RStudio, I've found that setting Sys.setenv('R_MAX_VSIZE'=32000000000) only works when R is started from the command line, and that setting that parameter while using RStudio does not prevent this error:

Error: vector memory exhausted (limit reached?)

After doing some more reading, I found a thread that clarifies the problem with RStudio and identifies a solution, shown below:

Step 1: Open Terminal.

Step 2:

cd ~
touch .Renviron
open .Renviron

Step 3: Save the following as the first line of .Renviron:

R_MAX_VSIZE=100Gb 

Note: This limit includes both physical and virtual memory, so setting R_MAX_VSIZE=16Gb on a machine with 16GB of physical memory may not prevent this error. You may have to experiment with this value, depending on the specs of your machine.
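
After saving .Renviron and restarting R (or RStudio), you can confirm that the new limit was picked up; the expected value shown below assumes the 100Gb setting above:

# run in a fresh R session after editing .Renviron
Sys.getenv("R_MAX_VSIZE")
#> [1] "100Gb"

# gc() also reports the vector heap limit (the "limit" column mentioned in the comments above)
gc()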

– Graeme Frost

R 3.5 has a new system limit on memory allocation. From the release notes:

The environment variable R_MAX_VSIZE can now be used to specify the maximal vector heap size. On macOS, unless specified by this environment variable, the maximal vector heap size is set to the maximum of 16GB and the available physical memory. This is to avoid having the R process killed when macOS over-commits memory.

You can override this limit. You risk over-allocating memory and having the process killed, but that is probably what was already happening when you hit a hard wall with R 3.4.4 or whatever you were using before.

Execute the following in Terminal to create a temporary environment variable R_MAX_VSIZE with a value of 32GB (change to suit):

export R_MAX_VSIZE=32000000000

If you don't want to open Terminal and run that every time you want to start an R session, you can append the same line to your bash profile: open Terminal, open your bash profile with open .bash_profile, and add the line above in a text editor.

You will still have to open Terminal and start R from there. You can run R in the terminal by executing R, or you can open the GUI with open -n /Applications/R.app.

To make this change within an R session, use Sys.setenv('R_MAX_VSIZE'=32000000000); to check the value, use Sys.getenv('R_MAX_VSIZE').
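
Putting that together, a minimal in-session sketch (though note the comments below: some users report the variable must be set before R starts for it to take effect):

# try to raise the vector heap limit to 32GB for the current session
Sys.setenv('R_MAX_VSIZE' = 32000000000)

# check what the environment variable is now set to
Sys.getenv('R_MAX_VSIZE')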

– Connor Dibble
  • Doesn't fix the problem for me. Sad face. – Gordon McDonald Aug 23 '18 at 02:29
  • @GordonMcDonald Maybe we can help with a bit more info. Are you finding that, after updating R, you are getting the vector memory error using code that executed without a problem prior to the update? If that is the case, this solution may help if you adjust upwards the number you assign to `R_MAX_VSIZE`. – Connor Dibble Sep 06 '18 at 20:59
  • @GordonMcDonald `Sys.setenv('R_MAX_VSIZE'=32000000000)` did not work for me. [Apparently](https://stackoverflow.com/questions/52029851/vector-memory-exhausted-using-r-package-dtw#comment91156723_52029851), you cannot set R_MAX_VSIZE [after R is started](http://r.789695.n4.nabble.com/R-3-5-0-vector-memory-exhausted-error-on-readBin-tp4750237p4750244.html). Using the answer provided by @GraemeFrost worked for me. – bonna Feb 15 '19 at 23:17

A solution for those who might be unfamiliar with the command line is to use the usethis package:

usethis::edit_r_environ() will open the .Renviron file in your home directory. This .Renviron affects all RStudio work.

usethis::edit_r_environ("project") will open an .Renviron local to your project. Changes made to this file only affect work done in that particular RStudio project.

Once open, the R_MAX_VSIZE variable can be set.
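
For example, assuming the same 100Gb value used in Graeme Frost's answer above (edit the file by hand, save, then restart R for it to take effect):

# install.packages("usethis")  # if usethis is not already installed
usethis::edit_r_environ()      # opens ~/.Renviron for editing

# add this line to the file, save, and restart R:
# R_MAX_VSIZE=100Gb

# after restarting, verify the limit was picked up:
Sys.getenv("R_MAX_VSIZE")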

The linked page also points to a blog post that describes R's startup process in great detail.

– Mir Henglin
  • Just a clarification note: in the above code, the argument is literally "project". It is not meant to be replaced with the name of your project. – Adam_G Feb 02 '20 at 22:28
  • This is really straightforward, thanks! I had to experiment a bit with the 'R_MAX_VSIZE' value to work out what was large enough; at 64Gb it runs without errors. – EcologyTom May 12 '21 at 14:37