
I have the following code for loading some data in my .Rprofile (an R script in my project folder that runs automatically when I switch to the project in RStudio).

data_files <- list.files(pattern="\\.(RData|rda)$")
if("data.rda" %in% data_files) {
  attach(what="data.rda", 
         pos = 2)
  cat("The file 'data.rda' was attached to the search path under 'file:data.rda'.\n\n")
}

The data being loaded is relatively big:

                             Type       Size PrettySize    Rows Columns
individual_viewings_26 data.frame 1547911120     1.4 Gb 3685312      63
viewing_statements_all data.table  892316088     851 Mb 3431935      38
weights                data.frame  373135464   355.8 Mb 3331538      14
pet                    data.table   63926168      61 Mb  227384      34

But I have 16 GB of RAM, and R can allocate all of it:

> memory.limit()
[1] 16289

When my data was not as big, I did not have any issue. I recently saved some more data frames in data.rda, and my R session now fails at start-up (when I switch to the project in RStudio and .Rprofile is executed):

Error: cannot allocate vector of size 26.2 Mb
In addition: Warning messages:
1: Reached total allocation of 2047Mb: see help(memory.size) 
2: Reached total allocation of 2047Mb: see help(memory.size) 
3: Reached total allocation of 2047Mb: see help(memory.size) 
4: Reached total allocation of 2047Mb: see help(memory.size) 

I suspect that, for some reason, the memory limit is set to 2 GB at start-up. Is there any way I can change that?

Edit: Added OS and software version

> sessionInfo()
R version 3.2.2 (2015-08-14)
Platform: x86_64-w64-mingw32/x64 (64-bit)
Running under: Windows 7 x64 (build 7601) Service Pack 1

Edit2: Just to clarify, I am able to load the data myself by running the code; I have plenty of available memory, and the R process commonly uses up to 10 GB during my daily work. The problem is that there is apparently a 2 GB memory limit when R boots and executes the .Rprofile...
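For instance, once the session is up, running the same attach from the console succeeds, and memory.limit() reports the full 16 GB; a console transcript looks roughly like this:

> memory.limit()
[1] 16289
> attach(what = "data.rda", pos = 2)
> search()[2]
[1] "file:data.rda"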

  • The system often doesn't allow a single program or process to allocate all memory. Since the answer is system-specific, please add what operating system you are using – nist Dec 11 '15 at 10:50
  • Possibly related: http://stackoverflow.com/q/10917532/4770166 – RHertel Dec 11 '15 at 10:54
  • @RHertel: does not seem related to me, I do have more than enough memory (12-14 GB free...). – asachet Dec 11 '15 at 11:29
  • I think this is still somewhat related. You can see the maximum memory that can be allocated using `memory.limit()`. You can also use the same function to increase that maximum. – Sam Dickson Dec 11 '15 at 14:26
  • @SamDickson As you can see in my post, I am aware of the function and it does show that the 16 GB are available. My system monitor shows that I am using 4 GB out of 16 GB, so available memory is not the issue. Furthermore, I am able to load the data once R has started. My question is about why the loading code seems to hit a 2 GB memory limit when it is run from the .Rprofile file. – asachet Dec 14 '15 at 10:05
  • You did give me the idea of investigating by printing memory.limit() from the .Rprofile to see what it reports and whether I can change it at start-up, thanks! – asachet Dec 14 '15 at 10:06

1 Answer


Yes, there is a 2 GB limit when R starts, at least while the user profile (.Rprofile files and .First() functions) is being executed.

Proof:

Content of .Rprofile:

message("Available memory when .Rprofile is sourced: ", memory.limit())

.First <- function() {
  message("Available memory when .First() is called: ", memory.limit())
}

Output at start-up:

Available memory when .Rprofile is sourced: 2047
Available memory when .First() is called: 2047

Output of memory.limit() once R has started:

> memory.limit()
[1] 16289
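
As a possible workaround (untested here, so treat it as a sketch): you could try raising the limit explicitly at the very top of .Rprofile before attaching anything, and/or defer the heavy attach into a helper that you call manually after start-up. On Windows, memory.limit(size = ...) can only increase the limit. The load_project_data() name below is just an illustrative helper, not an existing function.

# At the top of .Rprofile: attempt to raise the 2 GB start-up cap first.
# Whether this takes effect while .Rprofile is still being sourced is an
# assumption; if it does not, fall back to the deferred load below.
if (.Platform$OS.type == "windows") {
  try(memory.limit(size = 16000), silent = TRUE)
}

# Alternative: do not attach at start-up at all; define a loader to call
# manually once the session (and its full memory limit) is available.
load_project_data <- function() {
  if ("data.rda" %in% list.files(pattern = "\\.(RData|rda)$")) {
    attach(what = "data.rda", pos = 2)
    cat("The file 'data.rda' was attached to the search path under 'file:data.rda'.\n\n")
  }
}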