I have a large list of objects (say 100k elements). Each element has to be processed by a function "process", but I would like to do the processing in chunks, say 20 passes, because I want to save the results of each pass to a file on disk and keep working memory free.
I'm new to R, and I know that it should involve some apply magic, but I don't know how to do it (yet).
Any guidance would be much appreciated.
A small example:
# build a small example list of 100 identical elements
objects <- list()
for (i in 1:100) {
  objects <- append(objects, 500)
}
objects

processOneElement <- function(x) {
  x/20 + 23
}
I would like to process the first 20 elements in one go and save the results, then process the second 20 elements and save those results, and so on.
Using your example data (with processOneElement renamed to process):

objects <- list()
for (i in 1:100) {
  objects <- append(objects, 500)
}

process <- function(x) {
  x/20 + 23
}

A plain lapply processes the whole list in one pass, which keeps all the results in memory at once:

results <- lapply(objects, FUN = process)
To process in batches of 20 instead, build the start index of each batch and loop over those, writing every batch to its own file:

index <- seq(1, length(objects), by = 20)

lapply(index, function(idx1) {
  # end index of this batch, capped at the list length
  idx2 <- min(idx1 + 20 - 1, length(objects))
  # process just the elements of this batch
  batch <- lapply(objects[idx1:idx2], process)
  # unlist() turns the list of results into a plain vector,
  # which write.table can write cleanly
  write.table(unlist(batch), paste("batch", idx1, sep = ""))
})
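
If you'd rather avoid the index arithmetic, here is a sketch of the same idea using split() to build the chunks up front and saveRDS() to store each batch as an R object; the chunk size of 20 and the "batch" file name pattern are assumptions carried over from your example:

chunk.size <- 20
# assign each element a group number: 1,1,...,2,2,... (chunk.size per group)
groups <- ceiling(seq_along(objects) / chunk.size)
# split the list into a list of chunks
chunks <- split(objects, groups)

for (i in seq_along(chunks)) {
  batch <- lapply(chunks[[i]], process)
  # saveRDS() stores the batch as an R object on disk
  saveRDS(batch, file = paste("batch", i, ".rds", sep = ""))
}

Only one batch of results lives in memory at a time, and each file can be read back later with readRDS().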