I am trying to run the ExtremeBounds package in R, and it crashes because it appears to run out of memory...

Here is the error message:

Error: cannot allocate vector of size 2.6 Gb
In addition: Warning messages:
1: In colnames(vif.satisfied) <- colnames(include) <- colnames(weight) <- colnames(cdf.mu.generic) <- vars.labels :
  Reached total allocation of 16296Mb: see help(memory.size)
2: In colnames(vif.satisfied) <- colnames(include) <- colnames(weight) <- colnames(cdf.mu.generic) <- vars.labels :
  Reached total allocation of 16296Mb: see help(memory.size)
3: In `colnames<-`(`*tmp*`, value = c("(Intercept)", "US_CPI", "UK_CPI",  :
  Reached total allocation of 16296Mb: see help(memory.size)
4: In `colnames<-`(`*tmp*`, value = c("(Intercept)", "US_CPI", "UK_CPI",  :
  Reached total allocation of 16296Mb: see help(memory.size)
5: In `colnames<-`(`*tmp*`, value = c("(Intercept)", "US_CPI", "UK_CPI",  :
  Reached total allocation of 16296Mb: see help(memory.size)
6: In `colnames<-`(`*tmp*`, value = c("(Intercept)", "US_CPI", "UK_CPI",  :
  Reached total allocation of 16296Mb: see help(memory.size)

I have allocated 800 GB of hard drive space as virtual memory (the machine has 16 GB of physical RAM installed), and my Windows 10 computer runs on an Intel i7.

How can I tell R to use the additional allocated virtual memory? I looked up other questions, but the answers point to packages that seem to make the entire matter more complicated.
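For reference, this is roughly what I tried (a minimal sketch; `memory.size()` and `memory.limit()` are the Windows-only functions mentioned in the warnings, and 800000 is just my 800 GB of virtual memory expressed in MB):

```r
memory.size()                 # MB of memory currently used by R
memory.limit()                # current allocation limit in MB
memory.limit(size = 800000)   # try to raise the limit to ~800 GB
```

I understand the same limit can also be set when starting R with the `--max-mem-size` command-line flag, but neither approach seems to help.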

Also, here is my sessionInfo():

R version 3.2.2 (2015-08-14)
Platform: x86_64-w64-mingw32/x64 (64-bit)
Running under: Windows 8 x64 (build 9200)

locale:

[1] LC_COLLATE=English_United Kingdom.1252  LC_CTYPE=English_United Kingdom.1252    LC_MONETARY=English_United Kingdom.1252
[4] LC_NUMERIC=C                            LC_TIME=English_United Kingdom.1252    

attached base packages:
[1] stats     graphics  grDevices utils     datasets  methods   base     

other attached packages:
[1] ExtremeBounds_0.1.5.1

loaded via a namespace (and not attached):
[1] tools_3.2.2   Formula_1.2-1

Thank you very much!

Sam

  • Did you compile in 32-bit or 64-bit mode? – NathanOliver Apr 04 '16 at 15:53
  • Thank you Nathan. My computer runs on 64-bit. – Sam Vigne Apr 04 '16 at 16:08
  • 1
    `cannot allocate vector of size 2.6 Gb` means additional 2.6 GB. It doesn't mean it doesn't use all the RAM. What other indications do you have that it doesn't? I suspect your are trying to do something beyond what the package author imagined when designing the package. It seems like you run out of memory when R tries to assign column names, which can be done much more efficiently with some other packages which you find too complicated. However, with a problem of this size, you are beyond easy. – Roland Apr 04 '16 at 16:24
  • Dear Roland, thank you for your answer. Well, if I run the Task Manager and look at the RAM utilisation, it jumps up to 16 GB (my installed RAM) and then crashes. It seems like the system doesn't even attempt to use the virtual RAM allocated... You might be right about a limitation of the package, but it does stop as soon as it reaches 16 GB of RAM... – Sam Vigne Apr 04 '16 at 16:42
  • What exactly does "crashes" mean? A segfault? – Roland Apr 05 '16 at 06:49
  • Hi Roland, thanks for getting back to me. R stops running the regression and indicates it `cannot allocate vector of size 2.6 Gb`... I allocated 800 GB of virtual RAM on my C: drive though... I reckon I need to tell R to use that additional virtual RAM, but I am completely clueless on how to do that... I tried `memory.size()` and set `memory.limit()` to a high value, but it doesn't seem to help... – Sam Vigne Apr 05 '16 at 17:09
  • Possible duplicate of [R memory management / cannot allocate vector of size n Mb](http://stackoverflow.com/questions/5171593/r-memory-management-cannot-allocate-vector-of-size-n-mb) – Robert Gowland Apr 15 '16 at 20:46
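
A minimal sketch of the by-reference alternative Roland alludes to, assuming the `data.table` package (an illustration only, not what ExtremeBounds actually does): `colnames(x) <- value` duplicates the whole matrix (the `*tmp*` in the warnings above is that copy), whereas `data.table::setattr()` modifies the object in place.

```r
library(data.table)

## Stand-in for the large matrix ExtremeBounds builds internally.
m <- matrix(0, nrow = 1e6, ncol = 3)

## `colnames(m) <- value` evaluates as m <- `colnames<-`(`*tmp*`, value),
## which copies the entire matrix before assigning the names.
## setattr() sets the attribute by reference, with no copy.
setattr(m, "dimnames", list(NULL, c("(Intercept)", "US_CPI", "UK_CPI")))
```

Since the copies happen inside the package's own code, this would require patching ExtremeBounds rather than changing the calling script.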

0 Answers