I'm at a total loss on this one. I have a large, though not unreasonable, data frame in R (48000 rows × 19 columns). I'm trying to use sm.ancova() from the sm package to investigate differences in the effect slopes between groups, but I got
Error: cannot allocate vector of size 13.1 Gb
A 13 GB allocation overtaxes the memory available to R, I get that. But... what?! The entire CSV file I read in is only about 24 MB. Why are these single vectors so huge in R?
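For scale, the raw data really shouldn't be a problem. Here's a quick size check (just a sketch, using data1 and the column s from the code below, and assuming the columns are plain numeric vectors):

format(object.size(data1), units = "MB")    # whole 48000 x 19 data frame
format(object.size(data1$s), units = "KB")  # one column: 48000 doubles, ~375 KB

If all 19 columns are numeric, the whole data frame is roughly 48000 * 19 * 8 bytes, about 7 MB in memory.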
The ANCOVA code I'm using is:
library(sm)                            # sm.ancova() comes from the sm package
data1 <- read.csv("data.csv")
attach(data1)                          # exposes the columns s, dt and dip by name
sm.ancova(s, dt, dip, model = "none")
Looking into it a bit, I tried:
diag(s)       # "cannot allocate" error
length(s)     # 48000
diag(dt)      # "cannot allocate" error
length(dt)    # 48000
diag(dip)     # "cannot allocate" error
length(dip)   # 48000
Every diag() call gave the same allocation error; the length() calls all returned 48000.
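My best guess at the arithmetic (assuming diag() on a length-n vector tries to build a full n x n matrix of doubles):

n <- 48000
n * n * 8 / 2^30    # ~17.2 GiB for a 48000 x 48000 double matrix

That would explain why diag() blows up, and I assume sm.ancova() allocates some similarly square n x n object internally. But it still leaves me confused about why a 24 MB CSV would need matrices of that size.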
Any explanation would help. A fix would be better :)
Thanks in advance!
Dummy data that reproduces this problem can be found at: https://www.dropbox.com/s/dxxofb3o620yaw3/stackexample.csv?dl=0