
Strangely, I haven't found the answer, and perhaps there is no solution. When I load a large file in R (a 2 GB .csv with fread, for example), it uses roughly 2 GB of system memory, but even after deleting the imported data with rm() and gc(), the process still holds roughly 2 GB of system memory. My question is: after deleting unused data, is there a way to release the system memory held by R without restarting R?

easy.unleash.memory()    :o)
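
For reference, a minimal sketch of the steps I'm describing (the file name is just a placeholder for any ~2 GB .csv):

```r
library(data.table)

dt <- fread("big.csv")  # process memory grows by roughly the file size
rm(dt)                  # drop the only reference to the table
gc()                    # R's internal heap shrinks, but the process
                        # still holds ~2 GB at the system level
```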

Linux, 64bits, R version 3.4.3 (2017-11-30)
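
One way to observe this from within R on Linux (a sketch that reads the process's resident set size from /proc/self/status; "big.csv" is again a placeholder):

```r
# Resident set size of the current R process, in kB (Linux-only)
rss_kb <- function() {
  line <- grep("^VmRSS:", readLines("/proc/self/status"), value = TRUE)
  as.numeric(gsub("[^0-9]", "", line))
}

rss_kb()                            # baseline
dt <- data.table::fread("big.csv")
rss_kb()                            # up by roughly 2 GB
rm(dt); gc()
rss_kb()                            # still roughly 2 GB above the baseline
```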

I'm aware that several older posts are close to my question, but they don't solve my problem. It may well be that there is no answer, but in that case at least it will have been said!

  • Ok, thanks a lot. I'm fully aware of this post, but as I said, rm() and gc() (even with options) don't work in my case. I'm not on Windows and I don't want to restart R. – YannickN.fr Jul 05 '18 at 14:23
  • The answer is not any different. If you ran `gc()`, then that's the extent of what you can do. Maybe read more about memory in R: http://adv-r.had.co.nz/memory.html – MrFlick Jul 05 '18 at 14:28

0 Answers