I am using the BatchJobs package. I have a list of data.table objects that I am trying to iterate over, submitting one job per table. However, I receive the following error message:
batchMap(reg, function(dtb) dtb[,roll_reg(.SD),by=gvkey], dtb_input_list)
Error: serialization is too large to store in a raw vector
Each data.table is about 7,000 rows by 6 columns, and I cannot see why that would be too large to serialize: every search turns up 2^31 - 1 bytes as the limit for a raw vector. Here is a minimal example that reproduces the error:
require(BatchJobs)
# file-based registry; each job's payload is serialized to disk
reg <- makeRegistry(id="myreg", file.dir="myreg")
# ten small data.frames, 10,000 rows each
test_list <- lapply(1:10, function(i) data.frame(a=rep(i, 10000)))
batchMap(reg, function(dtb) nrow(dtb), test_list)
Error: serialization is too large to store in a raw vector
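As a sanity check (plain base R, independent of BatchJobs), the serialized size of each element can be measured directly; the list above is nowhere near the limit:

sapply(test_list, function(x) length(serialize(x, NULL)))
# byte count per element, on the order of 80 KB each -- far below 2^31 - 1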
EDIT: Pending more investigation. This error comes and goes with the exact same data, so I am trying to work out what else in the environment changes between runs.
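One hypothesis I am still checking (unconfirmed): serialize() on a function also serializes the function's enclosing environment, so the payload size can depend on whatever happens to live in that environment at submission time rather than on the data alone. A minimal sketch, where make_mapper and big are made-up names for illustration:

# a closure drags its enclosing environment along when serialized
make_mapper <- function() {
  big <- rnorm(1e6)           # ~8 MB object captured in the enclosing environment
  function(dtb) nrow(dtb)     # the function itself is trivial
}
f <- make_mapper()
length(serialize(f, NULL))    # roughly 8e6 bytes, not a few hundred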