I have R running on a Windows 10 machine with 16 GB of RAM. Task Manager shows that 13.7 GB of that is available when I start RStudio. I then load the dataset (which has about 10 million rows) and still have 11.8 GB of free RAM. Next I use "lm" to run a regression with 35 independent variables (a rough sketch of the call is below the error). Task Manager shows memory usage rising to 14 GB after about 30 seconds, and then I get this error:
Error: cannot allocate vector of size 4.1 Gb
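For reference, this is roughly what I am running. The file name, the outcome variable, and the formula are placeholders here, not my real column names; in my actual script the formula lists the 35 predictors explicitly:

    # load the ~10 million row dataset (placeholder file name)
    dat <- read.csv("mydata.csv")

    # regress y on the other columns; my real formula has 35 independent variables
    fit <- lm(y ~ ., data = dat)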
This seems very strange to me. First of all, there is much more memory available to R than the 4.1 Gb it says it cannot allocate. Second, 10 million observations and 35 variables is not even considered big data, and I don't understand why R has difficulty dealing with it. SAS would handle that in a second with no problem.
Do you know why I get this error, and what I can do to solve it?