
I'm running a very simple code in R (using RStudio) that uses an already coded function.

When using the function, I get the classic error:

"Error: cannot allocate vector of size XX",

because one of the inputs is a "large" vector for the purposes of the function.

I have looked for solutions, but they all point towards using memory.size() and memory.limit(). The problem is that I'm working on a server, so those functions are not available (they are Windows-only). Since I'm working on a server, in principle I should have no problem with memory: the available memory is far larger than the size R says it cannot allocate.
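For reference, here is a sketch of what I can check on the Linux server in place of memory.limit(). The unix package and the 64 GB figure are assumptions on my part, not something I have confirmed on this machine:

```r
# Sketch: inspecting memory limits on a Linux server, where
# memory.size()/memory.limit() do not exist (they are Windows-only).
# Assumes the 'unix' package is installed.

gc()               # what R is currently using (works on any OS)

library(unix)
rlimit_as()        # current soft/hard address-space limits, in bytes
# rlimit_as(64e9)  # hypothetically raise the soft limit to ~64 GB,
                   # if the hard limit and server policy allow it

# Equivalently, from a shell before starting R:  ulimit -v   (limit in kB)
```

If the soft limit reported there is already unlimited, the error presumably comes from the size of what the function tries to allocate, not from a cap on the process.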

Any suggestions would be extremely useful, thanks!!

EDIT: this is the code:

```r
rm(list = ls())

library(readstata13)
library(devtools)
library(csranks)
library(dplyr)

k1 <- read.dta13("k1.dta")

gc()

CS_simul <- cstauworst(k1$K1, k1$se1, tau = 10, R = 5, seed = 101, na.rm = TRUE)
```

cstauworst is a function from the csranks package. The data k1 is small (less than a MB, around 60k observations) but large for the purposes of the function. The algorithm requires using the whole data simultaneously, so I cannot run it piece by piece or parallelize.
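For scale, my back-of-envelope guess (purely an assumption about the internals of cstauworst, which I have not verified) is that a computation involving all pairs of observations would need a dense n-by-n matrix, which at ~60k observations is enormous even though the input file is tiny:

```r
# Hypothetical arithmetic, assuming the function builds a dense n-by-n
# double-precision matrix over all observations (not verified):
n <- 60000          # roughly the number of observations in k1
n^2 * 8 / 1024^3    # ~26.8 GiB for one such matrix
```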

  • [This](https://stackoverflow.com/questions/5171593/r-memory-management-cannot-allocate-vector-of-size-n-mb) may be helpful. Also, I don't know what you are dealing with, but if it is for modeling purposes, a dimension reduction technique like PCA or SVD may be useful as well. – maydin Jun 26 '20 at 19:20
  • Can you please add the code you are using? It may help us understand why this error is being produced. – cddt Jun 26 '20 at 20:48
  • Thanks for the comments! I edited the post to add the code. I also followed the link suggested by maydin, but I still can't make the code run following their advice. – user590983 Jun 26 '20 at 21:08
