
I want to calculate the trend of a raster stack in R, using daily data for 50 years for the whole of Europe.

```r
time <- 1:nlayers(gimms.sum)

fun <- function(x) {
  if (is.na(x[1])) {
    NA
  } else {
    m <- lm(x ~ time)
    summary(m)$coefficients[2]
  }
}

gimms.slope <- calc(gimms.sum, fun)
```

The code above was taken from here [1]. However, I get an error that R cannot allocate a vector of size 8.6 Gb.

[1] https://matinbrandt.wordpress.com/2013/11/15/pixel-wise-time-series-trend-anaylsis-with-ndvi-gimms-and-r/

How could I deal with this issue?
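As an aside, the per-pixel `lm()` call is also expensive in itself. A sketch (not from the post) of a closed-form alternative: for a fixed time vector, the OLS slope is `sum((t - mean(t)) * x) / sum((t - mean(t))^2)`, so the fit can be done with plain arithmetic. `gimms.sum` is the stack from the question, and the same "NA in the first layer means all NA" assumption is made:

```r
library(raster)

time   <- 1:nlayers(gimms.sum)
time.c <- time - mean(time)   # centred time vector
denom  <- sum(time.c^2)       # constant denominator, same for every pixel

# Closed-form OLS slope per pixel; avoids building an lm object per cell
slope.fun <- function(x) {
  if (is.na(x[1])) NA else sum(time.c * x) / denom
}

gimms.slope <- calc(gimms.sum, slope.fun)
```

This gives the same slope as `summary(lm(x ~ time))$coefficients[2]` for complete series, while doing far less work per pixel.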

  • are you able to display some summary stats for `gimms.sum` & `x`? e.g. with the `summary` function to get dimensions. The operations you're asking for are likely too big for your computer to process. One strategy would be to start with less data, i.e. not 50 years, and see if the code works on a smaller set first. If so, then you have 3 paths: (1) a bigger machine; (2) see if you can run the code in batches over smaller sets of data and bind the results together later (not always possible); (3) see if any of the code is memory-inefficient and improve it. Try the profvis package for this – Jonny Phelps Nov 03 '18 at 12:45
  • also, https://stackoverflow.com/questions/1395229/increasing-or-decreasing-the-memory-available-to-r-processes – Jonny Phelps Nov 03 '18 at 12:46
  • memory.limit() ? – Dr. Flow Nov 03 '18 at 14:46
  • Please take the time to review answers to your question and feed-back comments and mark accepted answers. – Léa Gris Jan 18 '20 at 00:41
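The "start with less data" suggestion in the comments could look like this (a sketch; the extent values and layer count are placeholders, and `fun` is the function from the question):

```r
library(raster)

# Crop to a small spatial window and keep roughly one year of daily layers,
# then check that the trend calculation runs before scaling up.
small <- crop(gimms.sum, extent(10, 12, 45, 47))
small <- subset(small, 1:365)

test.slope <- calc(small, fun)
```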

1 Answer


I assume you are using raster version 2.7-15. There is a mistake in a memory management setting in that version. You can fix that with `rasterOptions(maxmem=1e09)`, or use version 2.8-4 (which reached CRAN today, but is not yet compiled for Windows and macOS).
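A sketch of applying this workaround (using `maxmemory`, the full documented argument name). Writing the result straight to disk also keeps `calc()` from holding the whole output in RAM; the output filename is a placeholder:

```r
library(raster)

# Cap the amount of memory raster will try to use (bytes)
rasterOptions(maxmemory = 1e9)

# 'fun' as defined in the question; writing to file processes in chunks
gimms.slope <- calc(gimms.sum, fun, filename = "gimms_slope.tif")
```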

Robert Hijmans