
I am running R version 3.3.2 (x64) through RStudio on 64-bit Windows 10 with 16 GB of RAM.

My memory.limit() is 16274 MB, and I am using dummy.code() to create dummy variables from a column of a data frame with 445,622 rows and 2,699 unique values (so I am expecting a 445,622 x 2,699 matrix).

Just before calling dummy.code(), memory.size() reports 160.97 MB in use. However, I am getting the typical memory error:

Error: cannot allocate vector of size 9.0 Gb

My obvious question is why this happens, since 16274 MB - 160.97 MB leaves well over 9.0 GB free. I suspect that "there may not be a large enough contiguous block of address space available into which to map it", but that explanation is given here for 32-bit builds.
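For reference, a dense double matrix of these dimensions needs about 9 GB on its own, which matches the size in the error message:

445622 * 2699 * 8 / 1024^3   # 8 bytes per double; gives ~8.96, i.e. the 9.0 Gb reported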

Also, I found that when I initialise a zero matrix with these dimensions,

m <- matrix(0, 445622, 2699)

it is created without problems, so a lack of address space does not seem to be the reason:

> memory.size()
[1] 9204.29
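As a sanity check, the size of m itself can also be confirmed directly, and it accounts for essentially all of that jump:

print(object.size(m), units = "Gb")   # roughly 9 Gb for the dense double matrix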

But this makes me wonder: might there be a way to put the dummies into a pre-allocated matrix like this and then use it in my lm model? A rough sketch of what I mean is below.
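For concreteness, here is the kind of thing I have in mind (df, grp and y are placeholder names for my data frame, the column I am dummy-coding, and the response):

f <- factor(df$grp)                                  # 2,699 levels
m <- matrix(0L, nrow(df), nlevels(f),                # pre-allocated dummy matrix
            dimnames = list(NULL, levels(f)))        # (integer storage: 4 bytes per cell)
m[cbind(seq_len(nrow(df)), as.integer(f))] <- 1L     # set one 1 per row
fit <- lm(df$y ~ m)                                  # lm drops redundant columns if needed

I realise lm will still build its own numeric model matrix internally, so this may not actually save memory, which is part of what I am unsure about.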

If you spot any mistakes or misconceptions in my logic, please point them out.

Tony
  • Read the opening of the linked answer carefully. The solution to this problem is generally just to rethink your approach or to acquire computing resources with more RAM. – joran May 01 '17 at 17:05

0 Answers