I am currently working with a dataset of approximately 4 million data points. I am using R in RStudio on a MacBook Pro (32 GB RAM, 2.2 GHz Intel Core i7, 155 GB of available hard-drive space).
My goal is to fit a non-linear mixed-effects regression to the data. The data have two nested random effects, and the model needs varying slopes as well as intercepts.
My code for this model is:
model <- lmer(DV ~ I(IVr^2) + IV + (IV | group/episode), data = data, REML=FALSE)
However, when running the model, the process uses up ~42 GB of RAM before crashing. The error is:
Vector memory exhausted (limit reached?)
I want to configure R in some way so that it runs more slowly but can spill over into my available hard-drive space, letting it handle this run. The closest solution I have found is the biglm package in R; however, I cannot find an equivalent for lmer(). I also can't find much on manipulating swap space on a Mac. Any solutions to the problem are welcome.
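One thing I have come across but not yet verified: on macOS, the "vector memory exhausted" error is apparently governed by the R_MAX_VSIZE environment variable, which caps R's total vector allocations. Raising it in ~/.Renviron (the value below is an arbitrary example, not a recommendation) supposedly lets R keep allocating and fall back on the OS's swap, at the cost of speed:

```
# ~/.Renviron — restart R/RStudio afterwards for this to take effect
R_MAX_VSIZE=100Gb
```

I don't know whether this would actually let lmer() finish on a model this size, or just crash later.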
Alternatively, I speak Python, so a solution using Python instead would be great (i.e. a module that can handle polynomial mixed-effects models, plus some way around the memory issue).
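For reference, here is a minimal sketch of what I understand the Python equivalent would look like with statsmodels' MixedLM: a quadratic fixed effect, a random slope and intercept for group, and episode nested within group expressed through `vc_formula`. The column names (DV, IV, group, episode) just mirror my R formula, and the data here are simulated stand-ins; I have not tested this on the full 4-million-row dataset, and as far as I can tell statsmodels is still fully in-memory, so this does not by itself solve the memory problem:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated stand-in for the real data (same column names as the R model)
rng = np.random.default_rng(0)
n = 1000
data = pd.DataFrame({
    "IV": rng.normal(size=n),
    "group": rng.integers(0, 10, size=n).astype(str),
    "episode": rng.integers(0, 5, size=n).astype(str),
})
data["DV"] = 1.5 * data["IV"] + 0.5 * data["IV"] ** 2 + rng.normal(size=n)

# Quadratic fixed effect; random intercept + slope for group;
# episode-within-group as a variance component (nested random intercepts)
model = smf.mixedlm(
    "DV ~ IV + I(IV**2)",
    data,
    groups="group",
    re_formula="~IV",
    vc_formula={"episode": "0 + C(episode)"},
)
result = model.fit()
print(result.fe_params)
```

The `vc_formula` trick is, as I understand it, the statsmodels idiom for nesting, since `groups` only takes the top-level factor.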