I'm running a very simple piece of R code (in RStudio) that calls a function from an existing package.
When I call the function, I get the classic error:
"Error: cannot allocate vector of size XX",
because one of the inputs is a "large" vector for the purposes of the function.
I have looked for solutions, but they all point towards using memory.size() and memory.limit(). The problem is that I'm working on a server, so those functions are not available (they are Windows-only). Since I'm on a server, in principle I should have no memory problem: the available memory is far larger than the vector R says it cannot allocate.
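For what it's worth, this is roughly how I can inspect memory on the server instead (a minimal sketch, assuming a standard Linux host with the usual shell tools):

# Per-process virtual memory cap set by the shell ("unlimited" if none)
system("ulimit -v", intern = TRUE)
# Total and free memory on the machine
system("free -h", intern = TRUE)
# What R itself has currently allocated
gc()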
Any suggestions would be extremely useful, thanks!!
EDIT: this is the code:
rm(list = ls())             # start from a clean workspace
library(readstata13)        # to read the Stata .dta file
library(devtools)
library(csranks)            # provides cstauworst()
library(dplyr)
k1 <- read.dta13("k1.dta")  # load the data
gc()                        # garbage-collect before the heavy call
CS_simul <- cstauworst(k1$K1, k1$se1, tau=10, R=5, seed=101, na.rm=TRUE)
cstauworst is a function contained in the library csranks. The data k1 is "small" (less than 1 MB, around 60,000 observations) but large for the purposes of the function. The algorithm requires using the whole data simultaneously, so I cannot run it piece by piece or parallelize it. The only memory saving I can think of is trimming the object to the two columns the function actually uses before the call, as in the sketch below.
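A minimal sketch of that idea (it shrinks the input object but does not split the computation):

k1_small <- k1[, c("K1", "se1")]  # keep only the columns cstauworst() needs
rm(k1)                            # drop the full data frame
gc()                              # garbage-collect to reclaim the freed memory
CS_simul <- cstauworst(k1_small$K1, k1_small$se1, tau=10, R=5,
                       seed=101, na.rm=TRUE)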