I’m trying to implement parallel computing in an R package that calls C from R with the .C
function. It seems that the nodes of the cluster can’t access the dynamic library. I have made a parallel socket cluster, like this:
cl <- makeCluster(2)
I would like to evaluate a C function called valgrad from my R package on each of the nodes in my cluster using clusterEvalQ from the R package parallel. However, my code produces an error. My package compiles without issue, but when I run
out <- clusterEvalQ(cl, cresults <- .C(C_valgrad, …))
where … represents the arguments to the C function valgrad, I get this error:
Error in checkForRemoteErrors(lapply(cl, recvResult)) :
2 nodes produced errors; first error: object 'C_valgrad' not found
I suspect the problem is that clusterEvalQ cannot access the package's dynamic library. I attempted to fix this by loading the glmm package on each node using
clusterEvalQ(cl, library(glmm))
but that did not fix the problem.
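To narrow this down, I also checked whether the routine object is visible on the workers at all. This is only a sketch of that check; it assumes C_valgrad is the routine object created by useDynLib() registration in glmm's NAMESPACE:

```r
library(parallel)

cl <- makeCluster(2)
invisible(clusterEvalQ(cl, library(glmm)))

# C_valgrad lives in glmm's namespace; if it is not exported, the
# workers will not find it on their search path:
vis <- clusterEvalQ(cl, exists("C_valgrad"))

# The compiled symbol itself is loaded together with the package, so
# it can still be located explicitly on each node:
sym <- clusterEvalQ(cl, is.loaded("valgrad", PACKAGE = "glmm"))

stopCluster(cl)
```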
I can evaluate valgrad on each of the nodes using the foreach function from the foreach R package, like this:
out <- foreach(i = 1:no_cores) %dopar% {.C(C_valgrad, …)}
where no_cores is the number of nodes in my cluster. However, the results of evaluating valgrad are returned only to the master process, so they cannot be used in any subsequent calculation on the cluster.
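For completeness, the foreach version only runs in parallel once a backend is registered for the cluster; my setup looks roughly like the sketch below (doParallel as the backend is an assumption on my part, and the .C arguments are omitted as above):

```r
library(foreach)
library(doParallel)

no_cores <- 2
cl <- makeCluster(no_cores)
registerDoParallel(cl)

out <- foreach(i = 1:no_cores) %dopar% {
  library(glmm)  # each worker loads the package itself
  # .C(C_valgrad, …) goes here with the same arguments as before;
  # whatever the block returns is sent back to the master as out[[i]]
}

stopCluster(cl)
```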
How can I either (1) make the results of the evaluation of valgrad accessible for later calculations on the cluster, or (2) use clusterEvalQ to evaluate valgrad?