I am trying to build a large inverse-distance matrix from 227,973 coordinates. Ultimately, I want to use the result as the spatial weights matrix input to a spatial Durbin model.
This is my code:
library(spdep)  # for mat2listw()

# Full pairwise Euclidean distance matrix between all coordinates
dist_matrix <- as.matrix(dist(cbind(full_df$Longitude, full_df$Latitude)))
dist_matrix_inv <- 1 / dist_matrix  # inverse-distance weights
diag(dist_matrix_inv) <- 0          # zero out self-weights (1/0 = Inf on the diagonal)
ilw <- mat2listw(dist_matrix_inv, style = "W")  # row-standardised listw object
However, the dist() call fails with this error:
Error: vector memory exhausted (limit reached?)
I assume the problem is that the full matrix cannot fit in RAM? By my count, a 227,973 × 227,973 matrix of doubles would need roughly 416 GB (227,973² × 8 bytes).
As such, I have tried increasing the memory limit R imposes, following the solutions in Error: vector memory exhausted (limit reached?) R 3.5.0 macOS -- in particular, setting the parameter R_MAX_VSIZE=16Gb.
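For reference, this is the exact line I added to my ~/.Renviron file (assuming that is the right place to set it):

R_MAX_VSIZE=16Gb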
However, I still encounter the same error.
One answer suggests computing submatrices first and then piecing them together at the end to obtain the full matrix: How to create a Large Distance Matrix? However, I am not sure how to implement that correctly; my rough attempt is below. I am also unsure whether it would actually solve the problem of holding the full matrix in RAM (assuming that is the problem).
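Here is my sketch of the blocked approach (block_size and the loop structure are my own guesses; the final storage step is the part I am stuck on, since the assembled matrix would still be 227,973 × 227,973):

coords <- cbind(full_df$Longitude, full_df$Latitude)
n <- nrow(coords)
block_size <- 1000  # arbitrary guess
for (start in seq(1, n, by = block_size)) {
  end <- min(start + block_size - 1, n)
  # Euclidean distances from this block of rows to all n points
  block <- sqrt(outer(coords[start:end, 1], coords[, 1], "-")^2 +
                outer(coords[start:end, 2], coords[, 2], "-")^2)
  inv_block <- 1 / block
  inv_block[cbind(seq_len(end - start + 1), start:end)] <- 0  # zero self-weights
  # ... but where do I put each block? Assembling them all in RAM defeats the purpose.
}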
Can anyone advise?
Here is my system information, in case it is useful:
R version 3.5.2 (2018-12-20)
Platform: x86_64-apple-darwin15.6.0 (64-bit)
Running under: macOS Mojave 10.14.5
Thank you!