Strangely, I haven't found an answer, and perhaps there is no solution. When I load a large file in R (for example, a 2 GB .csv with fread) it uses roughly 2 GB of system memory, but even after deleting the imported data with rm() and gc(), R still holds roughly 2 GB of system memory. My question is: is there a way to release the system memory kept by R after deleting unused data, without restarting R?
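For reference, the steps above can be reproduced roughly like this (the file path is hypothetical):

```r
library(data.table)

# Load a large CSV (~2 GB); the path is a placeholder
dt <- fread("big_file.csv")

# Delete the object and trigger garbage collection
rm(dt)
gc()

# gc() now reports most of the heap as free, yet the resident
# memory the OS shows for the R process often stays near its
# peak, because R's allocator rarely returns pages to the OS.
```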
easy.unleash.memory() :o)
Linux, 64bits, R version 3.4.3 (2017-11-30)
I'm aware that several older posts are close to my question, but they don't solve my problem. It's quite possible that there is no answer, but in that case at least it will have been said!