
I'm using the R package kuenm to produce and project species distribution models.

I've produced the models without a problem, but when I try to evaluate the extrapolation risk for future projections with the function kuenm_mop I get the error:

Error: cannot allocate vector of size 92GB

The system I'm using runs Windows 8.1 Pro and has 64 GB of RAM (which I believe is the limiting factor here).
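
For scale, a quick back-of-the-envelope check: a numeric vector of that size holds roughly 12 billion 8-byte doubles, so this single object alone already exceeds the machine's RAM before counting anything R is already holding:

92 * 2^30 / 8   # bytes in 92 GB divided by 8 bytes per double
#> [1] 12348030976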

My question is: is it possible to work with a vector of greater size than my RAM?
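
(For what it's worth, plain R vectors *can* live on disk through packages such as ff, which keeps the data in a file and pages chunks into RAM on demand. kuenm does not use such structures, so this is only a sketch of the general idea, not a fix for the call below:)

library(ff)

big <- ff(vmode = "double", length = 2e9)  # ~16 GB backed by a temp file, not RAM
big[1:5] <- 1:5                            # chunks are read/written on demand
big[1:5]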

This is the function I'm using:

library(kuenm)

sets_var <- "Set_1" #set of variables used
out_mop <- "MOP_results" #output directory
percent <- 10
paral <- FALSE
is_swd <- FALSE
M_var_dir <- "M_variables"
G_var_dir <- "G_variables"


kuenm_mmop(G.var.dir = G_var_dir, M.var.dir = M_var_dir, sets.var = sets_var, is.swd = is_swd, out.mop = out_mop, percent = percent, parallel = paral)
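
One thing I have considered trying before renting bigger hardware: kuenm handles its grids through the raster package, whose in-memory budget can be capped so that large layers are processed block-by-block from disk. This is only a sketch, and it only helps if the 92 GB request comes from raster itself rather than from a single matrix kuenm builds internally:

library(raster)

# Lower raster's memory ceiling so large layers are processed from disk
# in blocks (see ?rasterOptions; units differ between raster versions).
rasterOptions(maxmemory = 1e9, chunksize = 1e8)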
  • Well, it is *possible* but not practical. You would need to rewrite significant parts of the package code. Personally, I would rent an Amazon EC2 instance with significantly more RAM. However, keep in mind that those 92 GB are not necessarily the whole memory demand: the error occurred when an additional 92 GB were requested while some memory was already in use, and subsequent steps might need even more. – Roland Aug 17 '21 at 10:55
  • I'm not familiar with that package or function, but ... in other packages/functions, that comes up when there is an "explosion" of comparisons/expansions or an uncontrolled join. For instance, in a [recent answer](https://stackoverflow.com/a/68803842/3358272) of mine, the same error is seen because `data.table::transpose` was doing a lot to a large frame that normally fits quite easily in memory. If this is possible, you may need to reach out to the author (or at least find somebody familiar with the function ... again, I am not, so this is just my two cents). – r2evans Aug 17 '21 at 12:21
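
Following up on Roland's comment about total memory demand: on Windows builds of R (before R 4.2, which removed this function), the allocation cap can be inspected and raised beyond physical RAM, in which case allocations spill into the page file. That is extremely slow, but occasionally enough to let a job finish. A sketch:

memory.limit()               # current cap, in MB
memory.limit(size = 131072)  # request ~128 GB (RAM plus page file)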

0 Answers