
I have been running an R script on my local machine, and it is sometimes successful and sometimes crashes the machine. To make it more reliable, I stood up a VDI with twice the RAM. When I run the code in this new environment it fails with the "Cannot allocate..." error every time, at the same place, even though the same code sometimes completes on my local machine. Is there some specific way I need to set this up so that R takes advantage of all of the RAM? I ran `memory.size()` and it shows 32 GB. Any advice helps.

I have already tried removing and reinstalling R and RStudio. When I run the code on my local machine it maxes out the memory, but when I run it on the VDI it gives up the moment memory usage gets high. I am not sure why that would be the case. The amount it cannot allocate is also less than 1 GB, so I can't imagine why this machine can't handle it.
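For R 3.6.x on Windows (the setup described above), one sanity check is to compare the session's usage against the allocation cap and raise the cap if it is below physical RAM. A minimal sketch; note that both functions report megabytes, not gigabytes, and were removed in R 4.2.0:

```r
# Windows-only memory diagnostics (R <= 4.1.x)
memory.size()            # MB currently used by this R session
memory.size(max = TRUE)  # peak MB obtained from the OS so far
memory.limit()           # current cap in MB; ~32000 expected on a 32 GB VDI

# Raise the cap explicitly if it reports less than the physical RAM
memory.limit(size = 32000)

# Free unreferenced objects and report usage before retrying the join
gc(verbose = TRUE)
```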

  • Welcome to SO! This site is about programming problems, so as such needs to have programming. The root of this problem could be in the OS, version of R, packages you're using, code you're using (just blind luck that it's failing more now), or perhaps other things. With what you've given, we do not know enough to really speculate besides that. Please consider making this question reproducible: sample code, sample data, and expected output. Refs: https://stackoverflow.com/questions/5963269, https://stackoverflow.com/help/mcve, and https://stackoverflow.com/tags/r/info. – r2evans Aug 16 '19 at 15:23
  • The failure seems to happen when I try to do a `left_join` (from package `data.table`) between a table with 121,125,618 obs of 9 variables and a table with 18,633 obs of 15 variables. Both machines are running Windows 10 Pro, and I am using R-3.6.1 on both. – BusinessAnalytics Aug 16 '19 at 15:41
  • Please read the links, particularly the first if none others. BTW: `left_join` is from `dplyr`, not `data.table`. If you are using `data.table` and are banking on its by-reference semantics, you should be using its native join mechanisms (see https://github.com/Rdatatable/data.table/wiki) or `merge` (`data.table:::merge.data.table` exists, so `merge` will be "efficient" if the left frame is a `data.table`). – r2evans Aug 16 '19 at 15:57
  • I don't have a solution, but I experienced the same problem. I had `data.table` joins that could run on my 16 GB i7 machine if nothing else was running, but when I tried them on a 32 GB Xeon VM they would always run out of memory. – Fino Aug 16 '19 at 17:16
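As a sketch of the native `data.table` joins the comments point to — the table names `big` and `small` are hypothetical stand-ins for the 121M-row and 18k-row tables described above:

```r
library(data.table)

# Hypothetical stand-ins for the two tables in the question
big   <- data.table(id = 1:5, x = letters[1:5])     # the large table
small <- data.table(id = c(2L, 4L), y = c(10, 20))  # the small lookup table

# dplyr equivalent (materialises a full copy of the result):
# dplyr::left_join(big, small, by = "id")

# data.table native left join: small[big] keeps every row of big
setkey(small, id)
result <- small[big]

# Or add the lookup columns to big by reference -- no copy of the
# large table is made, which keeps peak memory far lower
big[small, y := i.y, on = "id"]
```

The by-reference update join in the last line is the usual way to avoid duplicating a 100M+ row table in RAM during a join.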

0 Answers