
This question came at the right time, as I'm struggling with optimization as well. I'm aware of the usual "normal" optimization routines in R, and of parallel packages like snow, snowfall, Rmpi and the like. Yet I haven't managed to get an optimization running in parallel on my computer.

Some toy code to illustrate:

f <- function(x) sum((x - 1:length(x))^2)  # minimum at x = 1:length(x), where f = 0
a <- 1:5                                   # starting values
optim(a, f)                                # Nelder-Mead by default
nlm(f, a)                                  # Newton-type minimization

What I want to do is parallelize the optim() function (or nlm(), which does basically the same thing). My real function f() is a lot more complicated, and one optimization round takes about half an hour, so running a simulation of 100 samples takes ages. I'd like to avoid writing my own Newton-like algorithm for parallel computing, so I hope somebody can give me some hints on how to use parallel computing for complex optimization problems in R.
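
To make that concrete, one possible direction is to parallelize the numerical gradient and hand it to optim() via the gr argument; the minimal sketch below uses the parallel package (the grad_par helper and the step size h are my own illustration, not an established API):

library(parallel)

# toy objective; stands in for the expensive real f()
f <- function(x) sum((x - 1:length(x))^2)

cl <- makeCluster(detectCores() - 1)  # leave one core free
clusterExport(cl, "f")

# central-difference gradient: the 2*length(x) function
# evaluations are farmed out to the cluster workers
grad_par <- function(x, h = 1e-6) {
  pts <- unlist(lapply(seq_along(x), function(i) {
    up <- x; up[i] <- up[i] + h
    dn <- x; dn[i] <- dn[i] - h
    list(up, dn)
  }), recursive = FALSE)
  vals <- parSapply(cl, pts, f)  # alternating f(x+h), f(x-h)
  (vals[c(TRUE, FALSE)] - vals[c(FALSE, TRUE)]) / (2 * h)
}

optim(1:5, f, gr = grad_par, method = "BFGS")
stopCluster(cl)

For the toy f() the communication overhead dominates, but with a half-hour objective each gradient step would run its 2*length(x) evaluations concurrently instead of sequentially.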


I reckon this problem is of a different nature than the one in the related question: my request is specifically directed towards parallel computing, not a faster alternative for optim().

Joris Meys
    If your function is much more complicated, could you parallelize it instead of `optim`? – Joshua Ulrich Sep 21 '10 at 12:06
  • @Joshua: thanks, there is indeed some parallelization possible. Yet, I'd also like to use a swarm method for the optimization itself, as I need at least another ten-fold speed increase for the model to be workable in simulations. – Joris Meys Sep 21 '10 at 12:10

3 Answers


The R package optimParallel could be helpful in your case. The package provides parallel versions of the gradient-based optimization methods of optim(). The main function of the package is optimParallel(), which has the same usage and output as optim(). Using optimParallel() can significantly reduce optimization times, as illustrated in the following figure (p is the number of parameters).

[Figure: optimization time of optimParallel() compared to optim() as the number of parameters p grows]

See https://cran.r-project.org/package=optimParallel and http://arxiv.org/abs/1804.11058 for more information.
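
For illustration, a minimal sketch on the toy problem from the question; as far as I can tell, optimParallel() picks up the default cluster registered with the parallel package and implements the "L-BFGS-B" method of optim():

library(optimParallel)   # loads the 'parallel' package as well

f <- function(x) sum((x - 1:length(x))^2)
a <- 1:5

cl <- makeCluster(detectCores() - 1)   # leave one core free
setDefaultCluster(cl = cl)             # cluster used by optimParallel()

res <- optimParallel(par = a, fn = f)  # same interface as optim()
res$par

stopCluster(cl)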

Nairolf

To answer my own question:

There is a package in development that looks promising: it offers Particle Swarm Optimization methods and builds on the Rmpi package for parallel computing. It can be found on RForge:

http://www.rforge.net/ppso/index.html

It's still in beta AFAIK, but it looks promising. I'm going to take a look at it later on and will report back when I know more. Still, I'll leave the question open, so if anybody else has another option...

Joris Meys
  • If you're considering PSO, have you thought about differential evolution (via the DEoptim package)? Parallel computing support is on the package's to-do list and shouldn't take more than a few hours of work (for me, not you :-). – Joshua Ulrich Sep 21 '10 at 14:58
  • @Joshua: thanks for the tip, I didn't know DEoptim yet. It looks promising, but for the problem I'm working on now it's actually quite a bit slower than nlm(). I have 13 parameters and no clear lower and upper limits, so I have to set them rather wide to avoid missing a parameter... – Joris Meys Sep 21 '10 at 15:12
  • Tried the beta out and it seems to work. On my problem, it still doesn't provide the same improvement as parallelizing the function itself. Yet, I can see that in other cases this would really be a useful tool. I'm looking forward to the first stable release. – Joris Meys Sep 25 '10 at 18:39
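
For reference, a minimal DEoptim call on a 13-parameter version of the toy problem, mirroring the bounds issue raised in the comments above (the wide limits are illustrative; newer versions of DEoptim have also gained parallel evaluation options via DEoptim.control(), if I recall correctly):

library(DEoptim)

f <- function(x) sum((x - 1:length(x))^2)

# DEoptim requires explicit box constraints; without clear limits
# they have to be set generously wide
res <- DEoptim(f, lower = rep(-100, 13), upper = rep(100, 13),
               control = DEoptim.control(itermax = 200, trace = FALSE))
res$optim$bestmem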

Sprint might be of interest. I know nothing about it but stumbled across it recently.

High Performance Mark
  • Thanks for the pointer, but I knew it already. There are more frameworks for parallel computing in R, depending on the protocol you want to use. Yet, I couldn't find a (non-beta) optimization function that uses the power of parallel computing. – Joris Meys Sep 21 '10 at 14:38