Let's assume we have a large dataset of climatic data at monthly time steps for a large number of points around the world. The dataset is shaped as a data.frame
of the type:
lon, lat, data_month_1_yr_1, ..., data_month_12_yr_100
Example:
set.seed(123)
data <- data.frame(cbind(runif(10000, -180, 180), runif(10000, -90, 90)),
                   replicate(1200, runif(10000, 0, 150)))
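As a quick sanity check (a sketch that just re-runs the simulation above), the resulting frame should have one row per point and 2 coordinate columns plus 100 years x 12 months of values:

```r
# Rebuild the example data and confirm its shape:
# 10000 points, 2 coordinate columns + 1200 monthly values.
set.seed(123)
data <- data.frame(cbind(runif(10000, -180, 180), runif(10000, -90, 90)),
                   replicate(1200, runif(10000, 0, 150)))
dim(data)  # 10000 1202
```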
I would like to perform a Mann-Kendall test (using trend::mk.test) over the monthly time series of each of the spatial points and collect the main statistics in a data.frame. To speed up this very long process I parallelized my code and wrote something like the following:
coords <- data[, 1:2]  # pull the coordinates out of the initial dataset
names(coords) <- c("lon", "lat")
data_t <- as.data.frame(t(data[, 3:1202]))  # each column is now the time series associated with a point
data_t$month <- rep(seq(1, 12, 1), 100)  # month index as last column of the data frame
# start the parallel processing
library(foreach)
library(doParallel)
cores <- detectCores()  # count the cores
cl <- makeCluster(cores[1] - 1)  # use all cores minus one so as not to overload the PC
registerDoParallel(cl)
mk_out <- foreach(m = 1:12, .combine = rbind) %:%
  foreach(a = 1:10000, .combine = rbind) %dopar% {
    data_m <- data_t[which(data_t$month == m), ]
    library(trend)  # each worker is a separate R session, so the package must be attached there
    test <- mk.test(data_m[, a])
    mk_out_temp <- data.frame("lon" = coords[a, 1],
                              "lat" = coords[a, 2],
                              "p.value" = as.numeric(test$p.value),
                              "z_stat" = as.numeric(test$statistic),
                              "tau" = as.numeric(test$estimates[3]),
                              "month" = as.numeric(m))
    mk_out_temp
  }
stopCluster(cl)
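As an aside, the repeated library(trend) call inside the loop can be avoided with foreach's .packages argument, which attaches the package on every worker before the loop body runs. A minimal self-contained sketch of the same pattern (the 10-point, 5-year dataset here is a stand-in so it runs quickly, not the original 10000 points):

```r
library(foreach)
library(doParallel)

# Small stand-in data: 10 points, 5 years of monthly values
set.seed(1)
n_pts <- 10
coords_small <- data.frame(lon = runif(n_pts, -180, 180),
                           lat = runif(n_pts, -90, 90))
vals <- as.data.frame(t(matrix(runif(n_pts * 60, 0, 150), nrow = n_pts)))
vals$month <- rep(1:12, 5)

cl <- makeCluster(2)
registerDoParallel(cl)

mk_small <- foreach(m = 1:12, .combine = rbind) %:%
  foreach(a = 1:n_pts, .combine = rbind,
          .packages = "trend") %dopar% {  # trend is attached on each worker automatically
    data_m <- vals[vals$month == m, ]
    test <- mk.test(data_m[, a])
    data.frame(lon = coords_small[a, 1], lat = coords_small[a, 2],
               p.value = test$p.value,
               z_stat = unname(test$statistic),
               tau = unname(test$estimates[3]),
               month = m)
  }
stopCluster(cl)
```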
head(mk_out)
lon lat p.value z_stat tau month
1 -76.47209 -34.09350 0.57759040 -0.5569078 -0.03797980 1
2 103.78985 -31.58639 0.64436238 0.4616081 0.03151515 1
3 -32.76831 66.64575 0.11793238 1.5635113 0.10626263 1
4 137.88627 -30.83872 0.79096910 0.2650524 0.01818182 1
5 158.56822 -67.37378 0.09595919 -1.6647673 -0.11313131 1
6 -163.59966 -25.88014 0.82325630 0.2233588 0.01535354 1
This runs just fine and gives me exactly what I am after: a data.frame reporting the M-K statistics for each combination of coordinates and month. Although the process is parallelized, however, the computation still takes a considerable amount of time.
Is there a way to speed up this process? Is there any room for using functions from the apply family?
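For reference, the same computation can be written serially with apply-family functions; one likely source of the slowness is that the nested foreach subsets data_t once per point, whereas it only needs to be subset once per month. A hedged sketch under that assumption (run_mk is a hypothetical helper name; the tiny 10-point dataset is only there to make the example self-contained):

```r
library(trend)

# Hypothetical helper: subset each month once, then apply mk.test column-wise
run_mk <- function(data_t, coords) {
  do.call(rbind, lapply(1:12, function(m) {
    data_m <- data_t[data_t$month == m, ]  # subset once per month, not once per point
    stats <- t(sapply(seq_len(nrow(coords)), function(a) {
      test <- mk.test(data_m[, a])
      c(p.value = test$p.value,
        z_stat = unname(test$statistic),
        tau = unname(test$estimates[3]))
    }))
    data.frame(coords, stats, month = m)
  }))
}

# Tiny example run: 10 points, 5 years of monthly values
set.seed(42)
coords_ex <- data.frame(lon = runif(10, -180, 180), lat = runif(10, -90, 90))
data_t_ex <- as.data.frame(t(matrix(runif(10 * 60, 0, 150), nrow = 10)))
data_t_ex$month <- rep(1:12, 5)
res <- run_mk(data_t_ex, coords_ex)
```

This version is serial, so whether it beats the parallel foreach loop depends on the data size; the two ideas (subsetting per month and parallelizing) can also be combined by parallelizing only the outer lapply.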