
I know that some version of this question has been addressed multiple times in the past, but I think this iteration of the widely shared problem is sufficiently distinct to justify its own response. I would like to permanently set the maximum memory available to R to the largest value that my machine can handle, i.e., not just for a single session. I am running 64-bit R on a Windows 7 machine with 6 GB of RAM.

Currently I am trying to convert a 10 GB Stata file into a .rds object. On similar, smaller objects the compression in the .dta-to-.rds conversion has been a factor of four or better, and I (rather surprisingly) have not had any trouble doing dplyr manipulation on objects of 2 to 3 GB (after compression), even when two of them plus the work product are all in memory at once. This seems to conflict with my previous belief that the amount of physical RAM is the absolute upper limit on what R can handle, as I am fairly certain that between loaded .rds objects and various intermediate work products I have had more than 6 GB of undeleted objects lying about my workspace at one time.
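
For concreteness, the conversion itself is essentially the sketch below; haven::read_dta is just one possible reader (readstata13 is another), and the file name is made up:

# Sketch of the one-off .dta-to-.rds conversion (hypothetical file name)
library(haven)

dat <- read_dta("big_file.dta")                  # pulls the whole Stata file into RAM
saveRDS(dat, "big_file.rds", compress = "xz")    # xz compresses hardest but is slowest
rm(dat)
gc()                                             # hand the memory back before the next file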

I find conflicting statements about whether the maximum memory size is my actual RAM less OS demands, or my actual RAM, or my actual RAM plus an unknown (to me) amount of virtual RAM (subject to a potentially serious slowdown when you reach into virtual RAM). These file conversions are one-time (per file) jobs and I do not care if they are slow.

Looking at the base R help page on “Memory limits” and the help pages for memory.size(), it seems that there are several distinct limits under Windows, relating to the total memory used in a session, the memory available to a single process, the memory allocatable by malloc, and the length of a single vector. The individual vectors in my file are only around eight million elements (rows) long.
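
For reference, this is how I read the relevant calls from those help pages (all values in MB; output omitted):

memory.size()            # memory this R process is currently using
memory.size(max = TRUE)  # most memory obtained from the OS so far this session
memory.limit()           # the cap itself, which memory.limit(size = ...) can raise
gc()                     # run a garbage collection and report usage again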

memory.size() and memory.limit() both report current settings in the neighborhood of 6 GB. I got multiple warning messages saying that I was pressed up against that limit, but the actual error message was something like “cannot allocate vector of size 120 MB”.

So I think there are three distinct questions:

  1. How do I determine the maximum possible memory for each 64-bit R memory setting; and
  2. How many distinct memory settings do I need to make; and
  3. How do I make them permanent, as opposed to setting them only for a single session? (A first sketch of what I mean follows this list.)
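
As a starting point for question 3 (only a sketch of what I have in mind, not something I know to be right), I assume the in-session version of the setting would go in ~/.Rprofile so that it runs at startup:

# Sketch: raise the cap at startup from ~/.Rprofile. memory.limit() is
# Windows-only; 6000 MB is my physical RAM, and the limit cannot be
# lowered again once raised.
if (.Platform$OS.type == "windows") {
  memory.limit(size = 6000)
}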

Following the advice of @Konrad below, I had this rather puzzling exchange with R/RStudio:

> memory.size()
[1] 424.85
> memory.size(max=TRUE)
[1] 454.94
> memory.size()
[1] 436.89
> memory.size(5000)
[1] 6046
Warning message:
In memory.size(5000) : cannot decrease memory limit: ignored
> memory.size()
[1] 446.27

The first three interactions seem to suggest that there is a hard memory limit on my machine of about 455 MB. The second-to-last one, on the other hand, appears to say that the memory limit is set at my RAM level, without any allowance for the OS and without using virtual memory. Then the last one goes back to claiming a limit of around 450 MB.

I just tried the recommendation here: Increasing (or decreasing) the memory available to R processes, but with 6000 MB rather than 500; I'll report back on the results.
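
If I have understood that recommendation correctly, it amounts to telling R the limit at startup rather than from inside a session, i.e., something along the lines of the sketch below (the paths and values are mine, and I am not certain of the exact syntax):

# Two ways I understand the limit can be set outside a session on Windows:
#   1. In ~/.Renviron:                R_MAX_MEM_SIZE=6000M
#   2. On the shortcut that launches R:
#        "C:\Program Files\R\R-3.3.1\bin\x64\Rgui.exe" --max-mem-size=6000M
Sys.getenv("R_MAX_MEM_SIZE")   # empty string unless one of the above is set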

    On Windows you could try: `memory.size(max = TRUE)` to let **R** use all available memory. In terms of the actual operations, you could try exploring the command: `pryr::mem_change(stuff you want to do)` to see how much memory it may consume. – Konrad Sep 10 '16 at 09:02
  • What do you mean by `permanently`? The amount of available memory depends on the state of your PC. Say that you start R while a bunch of memory-consuming processes are running. No R setting will force the OS to free that memory, so R will have less memory available. Even within the same R session things may change depending on how many processes are running. – nicola Sep 10 '16 at 09:14
  • @nicola I think by permanently I mean two things. First, that it is done in .Rprofile or an environment variable or something to make it the default, and second, that when a process fails for lack of memory I want it to be because there is not enough memory, not because of the settings. I know that means I shouldn't run memory-intensive processes with 200 browser windows open and a video playing. – andrewH Sep 10 '16 at 22:25
