
I have a large ffdf object in R. It holds two numeric columns (x and y values), each with 71,998,512 values.

I am trying to apply the biglm function from the biglm package as below:

library(ff)
library(biglm)

dat <- ffdf(Back = Pitch_Back$V1, Head = Pitch_Head$V1)
fit_linear <- biglm(Back ~ Head, data = dat)

But I ran into the following memory-limit error:

Error: cannot allocate vector of size 549.3 Mb

How can I deal with it?
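
(For reference, a minimal sketch of a bounded-memory route, assuming the ffbase package is available; ffbase supplies a bigglm method for ffdf objects, so the fit is updated chunk by chunk and the full columns never need to be in RAM at once. The chunksize value is illustrative.)

library(ff)
library(biglm)
library(ffbase)   # assumption: provides the bigglm method for ffdf objects

# bigglm updates the regression incrementally over chunks of rows,
# so the two ~72M-value columns are streamed from disk, not loaded whole
fit_chunked <- bigglm(Back ~ Head, data = dat, chunksize = 1e6)
summary(fit_chunked)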

imtaiky
  • The not-so-serious but totally pragmatic answer is that with 2 variables and ~72 million points, if you sample about 1% of your data, your model will be about 99.999...% similar to the model built on all the data (see the sketch after these comments). – Stephen Henderson Jun 12 '19 at 09:07
  • 1
    the serious secondary answers are mostly to be found to this previously question: https://stackoverflow.com/q/1358003/1527403 – Stephen Henderson Jun 12 '19 at 09:20
  • Thank you very much. I am going to try it. – imtaiky Jun 12 '19 at 12:29
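
(A minimal sketch of the sampling idea from the first comment, reusing dat, Back, and Head from the question above; the seed and the 1% fraction are illustrative, and it assumes that row-indexing an ffdf with an integer vector returns an in-memory data.frame, as ff's `[.ffdf` does.)

set.seed(1)                                  # illustrative seed for reproducibility
n   <- nrow(dat)                             # ~72 million rows
idx <- sort(sample(n, ceiling(0.01 * n)))    # ~1% of rows; sorted indices read more sequentially from disk

# ~720k rows of two numeric columns is only a few MB once in RAM
samp <- dat[idx, ]

fit_sampled <- lm(Back ~ Head, data = samp)  # ordinary lm now fits in memory
summary(fit_sampled)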

0 Answers