
So I have two custom Rcpp functions, CustFunc1(x, y) and CustFunc2(a, b). Both are computationally demanding (thus the C++). My question is: is there a way in R to run them concurrently with foreach's `%dopar%`? Or can I make a system call directly to the compiled C++ code and parallelize it via some command-line tool?

Right now the flow is:

    result1 = CustFunc1(x, y)  ## takes 20 minutes
    result2 = CustFunc2(a, b)  ## takes 20 minutes

    get_both <- function(x) {
      foreach(i = seq_along(x)) %dopar% {
        result1 = CustFunc1(x, y)
        result2 = CustFunc2(a, b)
      }
    }
    out <- get_both(x)
    out$result1 == result1  # ???
    out$result2 == result2  # ???
K.J.J.K
    The best thing to do is roll the functions into a package, then you should be able to use the functions in parallel no problem. See, for example, [this answer](https://stackoverflow.com/a/38518629/8386140) to a related question on Stack Overflow, or [this blog post](https://blog.revolutionanalytics.com/2018/01/parallelize-rcpp.html) – duckmayr Mar 09 '19 at 02:23
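Following that suggestion, here is a minimal sketch of running two independent calls concurrently with foreach/doParallel: one iteration per task, so each worker handles one function. The trivial stand-in functions below are assumptions for illustration only; with the real Rcpp functions rolled into a package, you would load that package on the workers via the `.packages` argument instead.

```r
library(foreach)
library(doParallel)

## Trivial stand-ins for the real compiled functions (for illustration only).
CustFunc1 <- function(x, y) x + y
CustFunc2 <- function(a, b) a * b

cl <- makeCluster(2)
registerDoParallel(cl)

## One iteration per independent task, so the two calls run concurrently.
## For packaged Rcpp functions, add .packages = "yourPackage" (hypothetical
## name) to foreach() so each worker can find the compiled code.
results <- foreach(i = 1:2) %dopar% {
  if (i == 1) CustFunc1(2, 3) else CustFunc2(4, 5)
}

stopCluster(cl)

result1 <- results[[1]]  ## value of CustFunc1(2, 3)
result2 <- results[[2]]  ## value of CustFunc2(4, 5)
```

Note that raw `Rcpp::sourceCpp`/`cppFunction` functions cannot be shipped to PSOCK workers (the compiled pointers don't serialize), which is why the package route in the linked answer matters.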

1 Answer


If your functions take on the order of 20 minutes each, I would seriously consider rewriting them with the RcppParallel package; it isn't much harder than plain C++. That way you can use more threads, but run your functions one at a time. When I did this with my own code, I cut the computation time down from about 15 minutes to less than 2.
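To show the shape of such a rewrite: RcppParallel work is expressed as a small `Worker` class plus a `parallelFor`/`parallelReduce` call. This sketch closely follows the parallel vector sum example from the RcppParallel documentation; it is an Rcpp source file (compiled from R via `Rcpp::sourceCpp`), not a standalone C++ program.

```cpp
// [[Rcpp::depends(RcppParallel)]]
#include <Rcpp.h>
#include <RcppParallel.h>
#include <numeric>
using namespace RcppParallel;

// Worker that sums a chunk [begin, end) of the input vector.
struct Sum : public Worker {
  const RVector<double> input;
  double value;

  Sum(const Rcpp::NumericVector input) : input(input), value(0) {}
  // Split constructor: each thread gets its own accumulator.
  Sum(const Sum& sum, Split) : input(sum.input), value(0) {}

  void operator()(std::size_t begin, std::size_t end) {
    value += std::accumulate(input.begin() + begin, input.begin() + end, 0.0);
  }
  // Combine partial results from two threads.
  void join(const Sum& rhs) { value += rhs.value; }
};

// [[Rcpp::export]]
double parallelVectorSum(Rcpp::NumericVector x) {
  Sum sum(x);
  parallelReduce(0, x.length(), sum);
  return sum.value;
}
```

Your own functions would replace the body of `operator()` with the expensive per-element work; RcppParallel handles splitting the index range across threads.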

skatz