When I clear the Global environment in R, and then invoke garbage collection, R continues to hog memory (see code below). Is there a way to clean up memory "even more" without restarting R?
> rm(list = ls())
> gc()
          used (Mb) gc trigger   (Mb)   max used    (Mb)
Ncells  1352699 72.3    2419890  129.3    3886542   207.6
Vcells  4231877 32.3  910657255 6947.8 1412741750 10778.4
> unloadlibs() # wrapper to remove all packages in 'sessionInfo()$otherPkgs'
[1] "Unloaded packages:"
[1] "magrittr" "zoo" "reshape" "stringi" "openxlsx" "lubridate" "dtplyr" "dplyr" "RODBC"
[11] "data.table"
> gc()
          used (Mb) gc trigger   (Mb)   max used    (Mb)
Ncells  1349320 72.1    2419890  129.3    3886542   207.6
Vcells  4230328 32.3  728525804 5558.3 1412741750 10778.4
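For clarity: unloadlibs() is not a base function, it is my own helper that detaches every package listed in sessionInfo()$otherPkgs and prints their names. A minimal sketch of what it does:

# Minimal sketch of unloadlibs(): detach every attached non-base package
unloadlibs <- function() {
  pkgs <- names(sessionInfo()$otherPkgs)
  if (length(pkgs) > 0) {
    lapply(paste0("package:", pkgs), detach,
           character.only = TRUE, unload = TRUE)
  }
  print("Unloaded packages:")
  print(pkgs)
  invisible(pkgs)
}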
In this case, more than 700 MB of memory is still being used after cleaning everything up.
Is this a known issue, or am I doing something wrong (and if so, what)? What could be the culprit, and what kind of faulty code could lead to this? I have already:
- Dropped dplyr usage,
- Tried to use data.table wherever possible,
- And used copy() in the code whenever I was afraid that data.table references would somehow confuse garbage collection (I am not sure how a single reference to a data.table behaves); see the toy example below.
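To illustrate that last point, here is a toy example (object names made up) of the reference behaviour I am worried about:

library(data.table)

dt  <- data.table(x = rnorm(1e6))   # ~8 MB of doubles
dt2 <- dt                           # plain assignment: dt2 is just another reference to the same data
dt3 <- copy(dt)                     # copy() makes a full, independent copy

rm(dt, dt2)                         # the data behind dt/dt2 is only freed once ALL references are gone
gc()                                # dt3 still holds its own copy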
Thanks for your input.
Additional info:
> R.version
platform       x86_64-w64-mingw32
arch           x86_64
os             mingw32
system         x86_64, mingw32
status
major          3
minor          4.3
year           2017
month          11
day            30
svn rev        73796
language       R
version.string R version 3.4.3 (2017-11-30)
nickname       Kite-Eating Tree