
I am trying to extract single objects (data frames) from several different RData files into the same environment. The RData files are all very large, so they cannot be fully loaded: I get error messages saying the vectors cannot be allocated because of memory limits. I have tried almost all of the previous suggestions ('lazy loading', `attach()`, etc.), but none of them seems to work, as the memory limit is still exceeded.

Below is an example of what I am trying to do:

   rm(list = ls())
   ## attach() puts the file's contents at position 2 of the search path,
   ## so get() needs pos = 2, and detach() afterwards keeps only one
   ## chunk attached at a time
   attach("~/Data/results/chunk1.RData")
   a1 <- get("a1", pos = 2)
   detach(pos = 2)
   attach("~/Data/results/chunk2.RData")
   a2 <- get("a2", pos = 2)
   detach(pos = 2)
   attach("~/Data/results/chunk3.RData")
   a3 <- get("a3", pos = 2)
   detach(pos = 2)

I would appreciate any suggestion that could work with data this big.
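For reference, here is a minimal, runnable sketch of the load-one-file-at-a-time pattern I am attempting; a small demo file written to `tempdir()` stands in for the real large chunks, and the names are placeholders:

```r
# A runnable sketch of the one-file-at-a-time pattern (a demo file in
# tempdir() stands in for the real large chunks under ~/Data/results/).
chunk1 <- file.path(tempdir(), "chunk1.RData")
a1 <- data.frame(x = 1:3)
save(a1, file = chunk1)
rm(a1)

tmp.env <- new.env()
load(chunk1, envir = tmp.env)            # load the chunk into a scratch env
a1 <- get("a1", envir = tmp.env)         # copy out just the object needed
rm(list = ls(tmp.env), envir = tmp.env)  # drop everything else in the chunk
gc()                                     # ask R to return the freed memory
```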

peny
  • Could you load one RData file into your workspace using `load`? Also, it would be nice if you could provide links to the "previous suggestions". – zx8754 May 17 '17 at 20:25
  • Just avoid the issue by using `saveRDS()` and `readRDS()` -- same efficiency, but one file per object. – Dirk Eddelbuettel May 17 '17 at 20:28
  • @zx8754 These are some of the proposals that did not work in my case: http://stackoverflow.com/a/8703024/7513555 and http://stackoverflow.com/a/15487508/7513555 – peny May 17 '17 at 20:33
  • @zx8754 One RData file can be loaded, but not a second one. – peny May 17 '17 at 20:35
  • Then maybe loop through the RData files, and use `load()`, get the object, `rm()`, `gc()`? – zx8754 May 17 '17 at 20:41
  • @DirkEddelbuettel Even though I end up with many RDS files (because I am retrieving many single objects), it seems to work fine so far. Thanks! – peny May 17 '17 at 21:06
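The `saveRDS()`/`readRDS()` route suggested in the comments can be sketched as follows; the path and object names are placeholders, with a demo file written to `tempdir()`:

```r
# Sketch of the saveRDS()/readRDS() route: one .rds file per object, so a
# single data frame can be read back without touching anything else.
p1 <- file.path(tempdir(), "a1.rds")
a1 <- data.frame(x = 1:5)
saveRDS(a1, p1)     # serialize exactly one object
rm(a1)
a1 <- readRDS(p1)   # read it back, under whatever name we choose
```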

0 Answers