
I am trying to get some code working on computers with less than 4 GB of RAM, and I am using the 32-bit version of R to enforce that memory ceiling. I'm hitting a wall near the end of the script, where a memory-hogging command fails, even though that task by itself requires less than 4 GB. I've narrowed the problem down to the fact that, despite clearing all objects from memory in the current session, the R console is still holding 1.9 GB of RAM. The screenshot below highlights exactly where I'm hitting the problem: note that there are zero objects in memory, yet Task Manager says this instance of R is holding 1.8578 GB of RAM.

If I clear all objects from memory and then run gc(), that still does not release all of the memory held (as you can see in my screenshot).

Is it possible to clear this held memory somehow?

If it's of any use, you can reproduce this up to the point of the crash by running this script.

Thank you!

[screenshot: Task Manager showing this R session holding ~1.86 GB of RAM with an empty workspace]

Edit: at the end of the script I get

[1] "current designing ./2011/bst.rda"
Error: cannot allocate vector of size 434.7 Mb
In addition: There were 50 or more warnings (use warnings() to see the first 50)
> gc(verbose=T)
Garbage collection 27232 = 15350+4362+7520 (level 2) ... 
31.5 Mbytes of cons cells used (49%)
450.6 Mbytes of vectors used (21%)
           used  (Mb) gc trigger   (Mb)  max used   (Mb)
Ncells  1175911  31.5    2421436   64.7   1770749   47.3
Vcells 59048650 450.6  278146328 2122.1 461815808 3523.4
> rm(list=ls(all=TRUE))
> gc(verbose=T)
Garbage collection 27233 = 15350+4362+7521 (level 2) ... 
11.1 Mbytes of cons cells used (21%)
7.1 Mbytes of vectors used (0%)
         used (Mb) gc trigger   (Mb)  max used   (Mb)
Ncells 414283 11.1    1937148   51.8   1770749   47.3
Vcells 920035  7.1  222517062 1697.7 461815808 3523.4
> 
Anthony Damico
  • This thread may help: http://stackoverflow.com/questions/14580233/why-does-gc-not-free-memory – bjoseph Aug 20 '15 at 04:33
  • What does `gc(verbose=T)` show? It may be an issue with how R and Windows Task Manager interact. – bjoseph Aug 20 '15 at 04:34
  • You (heroically) assume that there is a magic switch that makes everything work with 4 GB and 32 bits. Well, sadly, there isn't, which is why just about everybody who has a choice in the matter has moved to a 64-bit OS and more memory. There is simply only so much you can do with a _dynamic_ language like R. – Dirk Eddelbuettel Sep 02 '15 at 02:22
  • You appear to have a memory fragmentation issue. Try allocating the largest possible vector(s) you need up front and reusing/overwriting them as necessary, rather than sequentially allocating and freeing; see the sketch after these comments. – A. Webb Sep 02 '15 at 15:49
  • I read the `svrepdesign` code and there are some calls to `apply` inside. My experience with the apply family is that `gc` does a poor job clearing what is needed. The only solutions I can think of are so messy, I'd rather not write them down. – Elad663 Sep 19 '15 at 19:16
  • I was getting the same error on a different project, saying I needed 900+ GB of RAM. I see you're using a formula. Check with `str()` what type the values are. If it's a factor with many levels, try changing it to numeric. That's what did the trick for me: from 900+ GB of RAM down to under 10. – Bas Nov 14 '15 at 09:21
  • I know this is not the geek answer you expected or were hoping for, but as a pragmatic solution to keep you rolling and not waste more time: when I ran into these sorts of problems, I'd save the workspace ".RData" (maybe clean it up a bit beforehand), close everything, restart R, load my ".RData", and continue with my workflow, even up to the point where this procedure became part of the workflow itself (via scripting). – GWD Dec 21 '15 at 14:05
  • Have you tried `gc(reset = T)`? – RyanStochastic Feb 24 '16 at 01:37
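
To illustrate A. Webb's suggestion, here is a minimal sketch of the preallocate-and-overwrite pattern. The vector size, the loop, and the process_year() helper are invented for illustration; the point is that one large block is allocated once and reused, instead of being repeatedly allocated and freed, which can fragment the 32-bit address space:

n <- 5e7                          # the largest vector the script will ever need

# fragmentation-prone pattern: a fresh large allocation on every iteration
# for (year in 2005:2011) {
#   big <- process_year(year)     # hypothetical helper; new allocation each time
#   rm(big); gc()
# }

# preallocate once, then overwrite in place on each iteration
big <- numeric(n)
for (year in 2005:2011) {
  big[] <- 0                      # reset contents without freeing the block
  # ... fill `big` with this year's data and process it ...
}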

3 Answers


This is not specific to R; it is how Windows works in general. When you remove a variable/object in R, the process does release that memory, but because of how Windows manages memory it is not fully returned to the OS right away: it is kept mapped to the process in case the process requests memory again. So Task Manager shows R as still holding all that memory.

So please don't worry: that memory is kept for reuse. :)
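
If you want to check this from inside R on Windows, one rough sketch (not from the original answer) is to compare R's current usage with the maximum it has obtained from the OS:

memory.size()            # MB currently used by R; small after rm() + gc()
memory.size(max = TRUE)  # maximum MB ever obtained from Windows; stays large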

suhao399

Is the memory released after you quit R?

Maybe you have some data stored on disk (in a temporary file) rather than loaded into R, so gc() will not capture that.

Or run mem_change(your command here), from the pryr package, from the very beginning to see what leads to the change in memory.
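
mem_change() is not in base R; assuming the answer refers to the version in the pryr package, a short example looks like this (the object x and its size are made up):

library(pryr)                    # install.packages("pryr") if needed

mem_used()                       # total memory currently used by R objects
mem_change(x <- numeric(1e7))    # roughly +76.3 MB to create x
mem_change(rm(x))                # roughly -76.3 MB once x is removed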

StayLearning

Try using these commands:

memory.limit()
# [1] 1934
memory.size()
# reports how much memory this session is currently using

# you can increase the memory limit for that particular session on a Windows machine
memory.limit(10000)  ## size in MB
memory.limit()
# [1] 10000

As yours is 32-bit, your maximum limit would be 4095 MB. To learn more about memory.size() and memory.limit(), run the commands below in your R console and read the help pages.

?memory.limit
?memory.size

Hope this helps a little. You can increase your memory limit, at least for that particular session.

Sowmya S. Manian