I have a fairly large raster file, called forest1, with a resolution of 0.9 arcsec x 0.9 arcsec and values between 0 and 100 (255 for NA). I want to aggregate this raster to the resolution of another raster (called dfr_2010_crop), which has a 0.5° x 0.5° resolution, using the mean of the values. Unfortunately, the strategy I am currently using requires too much memory. Namely, I am using
library(raster)

TreeCover <- rasterToPoints(forest1, fun = NULL, spatial = TRUE)
TreeCoverPercent <- rasterize(TreeCover, dfr_2010_crop, fun = function(x, ...) {sum(x, na.rm = TRUE) / (4 * 10^4)}, field = g)
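To spell out what the fun argument is supposed to do: for every 0.5° cell of dfr_2010_crop it receives the tree-cover values of the points falling into that cell and returns their sum divided by 4*10^4. A toy version with invented numbers would be:

x <- c(80, 90, 100, 70)             # invented tree-cover values of the points in one coarse cell
sum(x, na.rm = TRUE) / (4 * 10^4)   # what fun returns for that coarse cell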
In the rasterize() call, g is the field I have saved beforehand, and 4*10^4 is the number of 0.9 arcsec x 0.9 arcsec cells in a 0.5° x 0.5° cell. After running the first line, R tells me that it cannot allocate a vector of size 7.9 GB. I have tried to solve this problem in the following ways:
rasterOptions(maxmemory=1e+08)
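In case it is relevant, rasterOptions() also has a chunksize argument that controls how much data is read and written per block; I do not know whether changing it would help here, but a combined call would look like this (the values are arbitrary):

rasterOptions(maxmemory = 1e+08, chunksize = 1e+07)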
After this did not work, I tried to process the raster in blocks, following the approach given here (https://strimas.com/post/processing-large-rasters-in-r/), which uses the calc() function for block-wise processing. However, I failed to adapt it to my setting, as I do not know how to treat the blocks as rasters inside the loop. Here is my attempt:
# check whether forest1 can be processed in memory
canProcessInMemory(forest1, 1, TRUE)
# working in blocks
f_in <- f   # f is the file path of the large input raster
f_out <- tempfile(fileext = ".tif")
# input and output rasters
r_in <- stack(f_in)
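# empty output raster with the same extent and resolution as the input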
r_out <- raster(r_in)
# blocks
b <- blockSize(r_in)
print(b)
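# open the input for block-wise reading and the output for block-wise writing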
r_in <- readStart(r_in)
r_out <- writeStart(r_out, filename = f_out)
# loop over blocks
for (i in seq_along(b$row)) {
  # read values for block i
  # (a matrix with one row per cell and one column per layer)
  v <- getValues(r_in, row = b$row[i], nrows = b$nrows[i])
  # this is where my adaptation breaks down: v is a plain matrix here, not a
  # raster, so I cannot call rasterToPoints()/rasterize() on it as I did above
  v <- rasterToPoints(v, fun = NULL, spatial = TRUE)
  # write the processed values to the output file
  r_out <- writeValues(r_out, v, b$row[i])
}
# close files
r_out <- writeStop(r_out)
r_in <- readStop(r_in)
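In case it helps to state the target precisely: for a single 0.5° cell, the value I am after could be checked like this (far too slow for the whole raster, but it defines what I want each coarse cell to contain; the cell index and the extent arithmetic are only for illustration):

i  <- 1                                        # an arbitrary cell of dfr_2010_crop
xy <- xyFromCell(dfr_2010_crop, i)             # centre coordinates of that coarse cell
cell_ext <- extent(xy[1] - 0.25, xy[1] + 0.25, xy[2] - 0.25, xy[2] + 0.25)
fine <- crop(forest1, cell_ext)                # the fine cells inside that 0.5 degree cell
fine[fine == 255] <- NA                        # 255 codes NA in forest1
cellStats(fine, "mean", na.rm = TRUE)          # the value I want for this coarse cell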
Looking forward to any suggestions and thanks for your help.