
I have the following for loop

    for (i in 1:No_Simulations) {
      Vec <- rowSums(sweep(Matrix1, MARGIN = 2, Matrix2[i, ], `*`))
      if (i == 1) { Result <- Vec } else { Result <- cbind(Result, Vec) }
    }

in which `No_Simulations = 10000`, and both `Matrix1` and `Matrix2` have dimensions 10000 × 100.

I am not able to run this part of the code, as R gives me the following error: "impossible d'allouer un vecteur de taille 366.6 Mo"

i.e. unable to allocate a vector of size 366.6 MB. I have also tried increasing the memory limit with `memory.limit()`, but it still doesn't work. Could someone please help me?

user1415530
    Pre-allocate your `Result` matrix! You're asking for your code to take forever by growing an object (`cbind`) in a loop. – Joshua Ulrich Aug 31 '12 at 16:07
  • fwiw, to assign a vector of that size, you need **contiguous** memory, not just total. Preallocating your result matrix will likely solve this... if not, get more memory! – Justin Aug 31 '12 at 16:07
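
Neither comment includes code, so here is a minimal sketch of the pre-allocation approach they suggest (toy dimensions for illustration, not the 10000 × 100 inputs from the question):

    No_Simulations <- 100
    Matrix1 <- matrix(rnorm(200 * 10), nrow = 200, ncol = 10)
    Matrix2 <- matrix(rnorm(No_Simulations * 10), nrow = No_Simulations, ncol = 10)

    # Allocate Result once up front, then fill columns in place
    # instead of growing it with cbind()
    Result <- matrix(0, nrow = nrow(Matrix1), ncol = No_Simulations)
    for (i in 1:No_Simulations) {
      Result[, i] <- rowSums(sweep(Matrix1, MARGIN = 2, Matrix2[i, ], `*`))
    }

    # Each column is just Matrix1 %*% Matrix2[i, ], so the whole loop
    # collapses to a single matrix product:
    Result2 <- Matrix1 %*% t(Matrix2)
    stopifnot(all.equal(Result, Result2, check.attributes = FALSE))

Note that the full result in the question is 10000 × 10000 doubles, i.e. roughly 800 MB on its own, so growing it with cbind() (which copies the whole accumulated matrix on every iteration) runs out of memory quickly.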

3 Answers


I don't think this should be an answer, but I don't seem to be able to comment on your question (is my rep too low?):

I'm sure you've checked all the obvious issues with memory management:

  • Ensure that you physically have that much memory available to allocate
  • Ensure that your process's total usage is under the per-process addressable limit (only really a problem on 32-bit systems now)
  • Ensure that enough free memory exists as a contiguous block of the size you requested; memory fragments over time

And by playing with the numbers, are you able to determine the limit at which you can no longer allocate the memory? Is it the same every time, or does it change as you start/kill other processes (i.e. as differing amounts of memory are in use)?
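
A rough way to probe that limit is a sketch along these lines (the sizes here are illustrative assumptions):

    # A double takes 8 bytes, so a 366.6 MB vector holds ~46 million of them
    for (mb in c(128, 256, 512, 1024)) {
      ok <- tryCatch({
        x <- numeric(mb * 1024^2 / 8)  # allocate `mb` megabytes of doubles
        rm(x)
        TRUE
      }, error = function(e) FALSE)
      cat(mb, "MB:", if (ok) "allocated" else "failed", "\n")
      gc()  # hand the freed block back before the next attempt
    }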

EDIT: See R memory management / cannot allocate vector of size n Mb

im so confused

Unfortunately, 32-bit operating systems have an inherent limit to the amount of RAM they can use, and R won't put objects onto your hard drive and work from there. If you fill the RAM (usually when `memory.size()` returns about 1.8 GB), R returns a message stating it cannot allocate any more memory.

The bigmemory package is your best bet. This package allows you to use very large objects in R (with a speed hit). There are a load of examples that would be relevant to you in the bigmemory manual.
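
For instance, a file-backed matrix might look like this (a sketch, assuming bigmemory is installed; the file names are made up):

    library(bigmemory)

    # A file-backed big.matrix lives on disk rather than in RAM
    Result <- big.matrix(nrow = 10000, ncol = 10000, type = "double",
                         backingfile = "Result.bin",
                         descriptorfile = "Result.desc")
    Result[, 1] <- rnorm(10000)  # indexed like an ordinary matrix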

On top of that, there are some vanilla-R functions that help memory management in R:

  • memory.size(): returns information about the current (or maximum) memory usage in R (Windows only)
  • rm(): removes an R object from the workspace
  • gc(): calls the garbage collector, which frees memory that is no longer in use. Depending on the optional arguments given, the function also returns useful information about memory usage.
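
A short example of the three together (again, memory.size() exists only on Windows builds of R):

    x <- numeric(1e7)   # ~76 MB of doubles
    memory.size()       # current usage in MB
    rm(x)               # drop the reference to x...
    gc()                # ...then let the garbage collector release the memory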

I won't patronise you by saying to only take samples of your data, but the approach is useful for some (not all) problems.

Hope that helps!

edit: a very useful function for finding the memory size of an object in R is `object.size()`; see also: Tricks to manage the available memory in an R session.

Róisín Grannell
    Your statement that R "locks up" when it is unable to allocate contiguous memory is ambiguous and misleading at best, and entirely wrong at worst. It generally gives an informative error message and returns to the console prompt. That's not what I call "locking up". – IRTFM Aug 31 '12 at 20:32

You are doing this on a machine that probably has a bunch of other programs and processes running. Close all your programs and R. Restart your computer, start only R, and run your code again. It will probably succeed without complaint.

IRTFM