
I have a large *.Rdata file of 15 GB (15'284'533'248 bytes), created in RStudio on a MacBook Pro with 8 GB RAM, containing several lists of data frames.

Now I want to load() the file into RStudio on my PC with 32 GB RAM, but the RAM usage just swells beyond all measure, and in the end I get this:

Error: cannot allocate vector of size 78 Kb
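
For completeness, the call is nothing fancier than this (same file name as in the `save()` call quoted in the comments below):

```r
# Plain load() of the whole .Rdata file into the global environment
load("data.Rdata")
```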

The funny thing is, when I reload it on the Mac it works totally fine.

What's going wrong?

[Edit1] RStudio 1.0.136 on Mac, RStudio 1.1.383 on PC. Both R 3.4.2.

[Edit2] Screenshot of the Mac, which has 8 GB RAM


jay.sf

  • How did you save the Rdata file (what params for compression/etc)? Also: there's no way a 15GB Rdata file loads into memory on your 8GB Mac. – hrbrmstr Oct 23 '17 at 13:44
  • Just `save(file1, file2, file3..., "data.Rdata")`. – jay.sf Oct 23 '17 at 13:45
  • so, that means compression level is 6, which very likely means the actual data is hitting or over the memory limits for 32GB. And, there really is no way you had that in memory in your 8GB MacBook Pro. – hrbrmstr Oct 23 '17 at 13:47
  • I'm not that familiar with what this Mac exactly does and how; it's just as I stated. And "About This Mac" claims it has 8 GB. – jay.sf Oct 23 '17 at 13:54
  • Anyway, an Rdata file of that size isn't going to load into any system you have, and would likely need a system with 64GB of RAM to actually do analyses on it. I find it better in the long run to use individual RDS files and tgz them (see the sketch after these comments). – hrbrmstr Oct 23 '17 at 14:14
  • By `saveRDS()` I guess? – jay.sf Oct 23 '17 at 14:31
  • Looks like your Mac makes a lot more use of swap memory than your Windows machine. Not sure if I'm 100% right, but I think on Mac there is no real upper limit for its size. Windows does have a maximum space for that, see also https://superuser.com/questions/793304/how-to-increase-swap-memory-in-windows – Joris Meys Oct 24 '17 at 08:12
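
A minimal sketch of the per-object `saveRDS()` workflow suggested in the comments above; the object names (`df_list1`, `df_list2`) and file names are placeholders, not the actual objects from the question:

```r
# Save each large object to its own compressed .rds file instead of
# one monolithic .Rdata file (compress = "xz" is optional but smaller).
saveRDS(df_list1, file = "df_list1.rds", compress = "xz")
saveRDS(df_list2, file = "df_list2.rds", compress = "xz")

# Optionally bundle the .rds files for transfer between machines.
utils::tar("data.tar.gz", files = c("df_list1.rds", "df_list2.rds"),
           compression = "gzip")

# On the target machine: unpack, then read back only the objects that
# are actually needed, one at a time.
utils::untar("data.tar.gz")
df_list1 <- readRDS("df_list1.rds")
```

Unlike `load()`, which pulls every saved object back into memory at once, `readRDS()` returns a single object that you assign yourself, so the pieces can be loaded and processed one at a time.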
