
I'm trying to iteratively maximize a function of 24 parameters (possibly more in the future) by repeatedly calling the R function optim() with method "BFGS". However, it is slow: a single iteration takes 1-2 minutes. Is there any way to get faster convergence? Should I use another method?

My function is:

myFunc <- function(Woptim, free) {

  # Scatter the free parameters into a p x q weight matrix
  W.mat <- matrix(0, nrow = p, ncol = q)
  W.mat[free] <- Woptim

  total <- 0   # renamed from "sum" to avoid shadowing base::sum
  for (i in 1:n) {
    mat1 <- V[,,i] %*% Sigma
    detmat1 <- abs(det(mat1))
    fixed <- 1/2 * (log(detmat1) - sum(diag(mat1)) -
                      (m[,,i] %*% Sigma %*% as.matrix(m[,,i])) + q)

    sum2 <- 0
    for (d in 1:p) {
      val1 <- W.mat[d,] %*% m[,,i] + W0[d,]
      val2 <- W.mat[d,] %*% V[,,i] %*% as.matrix(W.mat[d,])
      sum2 <- sum2 + data[i,d] * val1 - 1/2 * val2 - log(1 + exp(val1 + 1/2 * val2))
    }
    total <- total + fixed + sum2
  }

  return(total)
}

and the values that I use in one iteration:

n <- 1000
p <- 24
q <- 3
m <- array(0, dim = c(q, 1, n))
V <- array(diag(q), dim = c(q, q, n))
Sigma <- diag(q)
W0 <- matrix(1, ncol = 1, nrow = p)
free <- c(1:8, 33:40, 65:72)
data <- matrix(sample(0:1, n * p, replace = TRUE), ncol = p, nrow = n)  # n * p draws; sampling only 2 values would just recycle them
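For reference, a self-contained sketch of how a single maximization run can be set up (this is not from the original post; it repeats the question's setup with a smaller n so it runs quickly). Note that optim() minimizes by default, so control = list(fnscale = -1) is needed to maximize, and the starting vector needs one entry per index in free:

```r
# The question's setup, shrunk (n = 50 instead of 1000) so this runs fast
set.seed(1)
n <- 50; p <- 24; q <- 3
m <- array(0, dim = c(q, 1, n))
V <- array(diag(q), dim = c(q, q, n))
Sigma <- diag(q)
W0 <- matrix(1, ncol = 1, nrow = p)
free <- c(1:8, 33:40, 65:72)
data <- matrix(sample(0:1, n * p, replace = TRUE), ncol = p, nrow = n)

# The question's objective (loop version)
myFunc <- function(Woptim, free) {
  W.mat <- matrix(0, nrow = p, ncol = q)
  W.mat[free] <- Woptim
  total <- 0
  for (i in 1:n) {
    mat1 <- V[,,i] %*% Sigma
    fixed <- 1/2 * (log(abs(det(mat1))) - sum(diag(mat1)) -
                      (m[,,i] %*% Sigma %*% as.matrix(m[,,i])) + q)
    sum2 <- 0
    for (d in 1:p) {
      val1 <- W.mat[d,] %*% m[,,i] + W0[d,]
      val2 <- W.mat[d,] %*% V[,,i] %*% as.matrix(W.mat[d,])
      sum2 <- sum2 + data[i,d] * val1 - 1/2 * val2 - log(1 + exp(val1 + 1/2 * val2))
    }
    total <- total + fixed + sum2
  }
  drop(total)  # optim() wants a plain scalar, not a 1x1 matrix
}

# optim() minimizes by default; fnscale = -1 turns it into maximization.
start <- rep(0, length(free))
fit <- optim(start, myFunc, free = free, method = "BFGS",
             control = list(fnscale = -1, maxit = 10))
fit$value
```

Timing one bare call with system.time(myFunc(start, free)) before optimizing tells you how much of the 1-2 minutes is the objective itself: BFGS with numerical gradients costs roughly length(free) + 1 objective evaluations per step, so a slow objective multiplies quickly.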
Comments:

  • That would almost surely depend entirely on the details – Dason Apr 05 '21 at 17:18
  • You could try `method="Nelder-Mead"` and see if that helps (it will avoid the cost of computing derivatives by finite differences), but otherwise it's as @Dason says. – Ben Bolker Apr 05 '21 at 17:19
  • I've heard some recommend `optimx` (in its own package) as an alternative - not sure if that would help here. But of course the biggest thing you can do to speed things up is streamline the code that calculates the objective function. A different optimization method may decrease the number of iterations needed, but improving your objective function is the way to reduce the time per iteration. – Gregor Thomas Apr 05 '21 at 17:22
  • @Dason details such as the type of data? – tata Apr 05 '21 at 17:25
  • Details such as a [mcve]. Ideally we need to know *everything* about what you're doing (code & data), hopefully boiled down to a manageable problem statement. This is hard if you have a complicated setup, but you need to try. https://stackoverflow.com/questions/5963269/how-to-make-a-great-r-reproducible-example . At the very least we need to know something about your objective function. – Ben Bolker Apr 05 '21 at 17:27
  • @BenBolker well my setup is a little bit complicated. My objective function includes a double loop, one over my 25 parameters and one over my sample size which is ~1000 observations, and inside the loops, matrix multiplication is required. – tata Apr 05 '21 at 17:33
  • Rephrasing what @GregorThomas said: This isn't `optim`'s fault. The speed of one "iteration" depends on the thing you are optimizing. Make your objective function faster, e.g., refactor your double loop into vectorized operations, if possible. – DanY Apr 05 '21 at 17:34
  • This probably isn't going to be answerable unless you can do some profiling and boil things down. – Ben Bolker Apr 06 '21 at 18:39
  • I added a simplified version of my code. – tata Apr 06 '21 at 20:57
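Following the vectorization advice in the comments, here is a sketch (not from the original post) of the objective with the inner loop over d replaced by matrix operations. It assumes the same global objects (n, p, q, m, V, Sigma, W0, data) as the question; log1p(exp(x)) replaces log(1 + exp(x)) for better numerical behavior:

```r
# Vectorized objective: computes all p rows of W.mat at once per observation,
# so the inner loop over d disappears. Assumes the question's globals exist.
myFuncVec <- function(Woptim, free) {
  W.mat <- matrix(0, nrow = p, ncol = q)
  W.mat[free] <- Woptim

  total <- 0
  for (i in 1:n) {
    mat1 <- V[,,i] %*% Sigma
    mi <- m[,,i]                                 # length-q vector
    fixed <- 0.5 * (log(abs(det(mat1))) - sum(diag(mat1)) -
                      drop(mi %*% Sigma %*% mi) + q)

    # val1 and val2 are length-p vectors covering every d in one shot
    val1 <- drop(W.mat %*% mi) + drop(W0)
    val2 <- rowSums((W.mat %*% V[,,i]) * W.mat)  # diag(W V W') without the full p x p product
    total <- total + fixed +
      sum(data[i, ] * val1 - 0.5 * val2 - log1p(exp(val1 + 0.5 * val2)))
  }
  total
}
```

The rowSums((W.mat %*% V[,,i]) * W.mat) trick computes only the diagonal of W V Wᵀ, which is all the original val2 ever used. The remaining loop over i could also be restructured (e.g. stacking the n small matrix products), but removing the inner p-loop alone already replaces 24 small products per observation with two.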

0 Answers