
I'm running a spatial error model on a large dataset (n = 26,000) for a hedonic price analysis. I have built a nearest-neighbor (k = 10) spatial weights file and `listw` object. However, when I run the actual `errorsarlm()` call, I get the following error: `Error: cannot allocate vector of size 5.1 Gb`. I suspect this has to do with the large spatial weights matrix being created, but I haven't found a way around it.

I have already tried:

1. Clearing out my global environment
2. Reducing the number of columns in my original data frame to the bare minimum
3. Reducing the number of nearest neighbors to 5
4. Increasing my memory limit with `memory.limit(size = 56000)`
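
In code, those attempts look roughly like this (illustrative only; the column subset keeps just the variables used in the model below, and `memory.limit()` is Windows-only):

```r
rm(list = setdiff(ls(), "CONDO20"))   # (1) clear everything except the data
gc()                                  # hand freed memory back to the OS
# (2) keep only the model variables and the point coordinates
CONDO20 <- CONDO20[, c("saleamount_num18LOG", "var1", "var2", "var3",
                       "POINT_X", "POINT_Y")]
# (3) is just k = 5 instead of k = 10 in knearneigh() below
memory.limit(size = 56000)            # (4) raise the Windows memory cap (in MB)
```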

```r
library(spdep)        # knearneigh(), knn2nb(), nb2listw()
library(spatialreg)   # errorsarlm() (in older spdep versions it lives in spdep itself)

step1_knn_CONDO20   <- knearneigh(cbind(CONDO20$POINT_X, CONDO20$POINT_Y), k = 10)
step2_nb_CONDO20    <- knn2nb(step1_knn_CONDO20)
step3_listw_CONDO20 <- nb2listw(step2_nb_CONDO20)
CONDO_SEM_17_TEST   <- errorsarlm(saleamount_num18LOG ~ var1 + var2 + var3,
                                  data = CONDO20, listw = step3_listw_CONDO20,
                                  tol.solve = 1e-20)
```
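
One thing I have not tried yet is the `method` argument of `errorsarlm()`. As I read the spatialreg documentation, the default `method = "eigen"` works with a dense n x n weights matrix (for n = 26,000 that is roughly 5 GB of doubles, which is in the ballpark of the 5.1 Gb in the error), while the sparse log-determinant options avoid that allocation. A sketch of what that call might look like, untested on my data:

```r
# Untested sketch: same model, but asking for a sparse log-determinant method
# instead of the default dense "eigen" approach
CONDO_SEM_17_TEST <- errorsarlm(saleamount_num18LOG ~ var1 + var2 + var3,
                                data = CONDO20, listw = step3_listw_CONDO20,
                                method = "Matrix",   # sparse methods via the Matrix package
                                tol.solve = 1e-20)
```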
  • You are running out of memory. Shrink the size of your data, add more memory to your computer, or consider cloud computing. These are the first-order solutions. Maybe your data / analysis could work with the `bigmemory` package or `ff`, which can store data on the hard drive. – lmo Mar 27 '19 at 00:16
  • See [this post](https://stackoverflow.com/questions/5171593/r-memory-management-cannot-allocate-vector-of-size-n-mb?rq=1) for more details on this issue. – lmo Mar 27 '19 at 00:18
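
For reference, a sketch of the `bigmemory` suggestion from the comment above (the file name and backing-file arguments are placeholders; as far as I can tell `errorsarlm()` still expects an ordinary data frame, so this mainly keeps the raw table out of RAM between steps):

```r
library(bigmemory)

# Placeholder file names; a big.matrix holds a single type, so this assumes
# the columns of interest are all numeric
CONDO20_big <- read.big.matrix("CONDO20.csv", header = TRUE, type = "double",
                               backingfile = "CONDO20.bin",
                               descriptorfile = "CONDO20.desc")

# pull just the coordinate columns back into RAM for the knn step
coords <- CONDO20_big[, c("POINT_X", "POINT_Y")]
```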
