
After running a long script, I looked at Windows Task Manager (Windows 7, 64-bit) and saw that over 3 GB of memory was in use. I removed all variables from memory, but the memory was still in use. So I searched SO, found this post, and tried running gc(). I get the following:

> gc()
          used (Mb) gc trigger  (Mb)  max used   (Mb)
Ncells  893182 47.8    1835812  98.1   1835812   98.1
Vcells 2061325 15.8   19962389 152.4 407302954 3107.5

This is after a few minutes of waiting. I was expecting much of the memory to be freed. Am I missing something? Thanks.

rmacey
  • it is usually a good habit to follow rm(list=ls()) with a gc() call, to prompt R to free the memory (see the sketch after these comments). – GWD Dec 21 '15 at 13:36
  • No, @GWD, it's not. Most of these gains are mythical. – Dirk Eddelbuettel Dec 21 '15 at 13:39
  • who am I to argue with you, Dirk; but I have experienced that 'long' code stretches gained a bit on the stability side if I threw in an occasional gc() every now and then, especially when running code in the RStudio IDE on Windows it seemed to help; nevertheless, it could be that wishful thinking obscured my perception ... – GWD Dec 21 '15 at 13:46
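For the record, the pattern under discussion in these comments is just the following (a minimal sketch; whether the explicit gc() call actually helps is exactly what is disputed above):

rm(list = ls())  # remove all user-created objects from the global environment
gc()             # then ask R to run a garbage collection and print its summary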

1 Answer


One possible explanation, from Hadley Wickham's Advanced R book:

This number [i.e. the amount reported by pryr::mem_used(), which should agree with the values given by gc()] won’t agree with the amount of memory reported by your operating system for a number of reasons: ...

  1. Both R and the operating system are lazy: they won’t reclaim memory until it’s actually needed. R might be holding on to memory because the OS hasn’t yet asked for it back.
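For context, in the gc() output above the Vcells "max used" column reads 3107.5 Mb, roughly the 3 GB peak that Task Manager showed, while the current "used" is only about 16 Mb: R has already released that memory internally, even if the OS still reports a large working set for the process. A minimal sketch of what to expect (the object and its size are arbitrary; the OS-level numbers will vary by platform):

x <- numeric(4e8)   # allocate ~3.2 GB of doubles (8 bytes each)
gc()                # "used" and "max used" both jump
rm(x)               # drop the only reference to the vector
gc()                # "used" falls back; "max used" still records the peak
# Task Manager may keep showing a large footprint for the R process,
# because the OS reclaims pages lazily, as the quoted passage explains.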
Ben Bolker