
I'm trying to use the shgo algorithm to run simulations (a black-box problem) and maximize the output parameter of the simulation. The objective function runs and evaluates the simulation. I have 5 input variables. I need to define bounds and constraints, which are needed to limit the geometry of the simulation. As this is a problem with many variables, I needed a global optimizer that accepts bounds and constraints, so shgo seemed perfectly suitable. However, I am struggling to get the optimizer to respect my bounds and constraints and to converge.

This is my code for the optimization:

bnds = [(50*1e-9,500*1e-9), (50*1e-9,500*1e-9), (1,20), (20*1e-9,80*1e-9), (250*1e-9,800*1e-9)]

def constraint1(x):
    return x[4]-50*1e-9-2*x[0] # x[4] >= 2*x[0] + 50nm (threshold)
def constraint2(x):
    return x[1]-x[3]-20*1e-9 # x[1]-x[3]>=20nm(threshold)  
def constraint3(x):
    return x[0]-(x[1]/2)*(2.978/x[2])-20*1e-9

cons = ({'type': 'ineq', 'fun': constraint1},
        {'type': 'ineq', 'fun': constraint2},
        {'type': 'ineq', 'fun': constraint3})

minimizer_kwargs = {'method':'COBYLA',
                    'bounds': bnds,
                    'constraints':cons}   

opts = {'disp':True}

res_shgo =  shgo(objective, 
                 bounds=bnds, 
                 constraints=cons, 
                 sampling_method='sobol', 
                 minimizer_kwargs=minimizer_kwargs, 
                 options=opts)

The global algorithm runs for 33 rounds to complete the evaluations and starts the minimiser pool:

Evaluations completed.
Search for minimiser pool
--- Starting minimization at [3.3828125e-07 4.6484375e-07 1.1984375e+01 6.7812500e-08 7.5703125e-07]...

Now, the COBYLA algorithm is used within the minimiser pool for the minimization. However, after a few rounds it exceeds the bounds, with the result that the input parameters cause my simulation to crash.


I have also tried the 'L-BFGS-B' algorithm for the minimizer pool.

minimizer_kwargs = {'method':'L-BFGS-B'}

The algorithm converged with the following statement:

lres =       fun: -20.247226776119533
 hess_inv: <5x5 LbfgsInvHessProduct with dtype=float64>
      jac: array([ 1.70730429e+09,  1.22968297e+09,  0.00000000e+00, -1.82566323e+09,
        1.83071706e+09])
  message: 'CONVERGENCE: NORM_OF_PROJECTED_GRADIENT_<=_PGTOL'
     nfev: 6
      nit: 0
     njev: 1
   status: 0
  success: True
        x: array([2.43359375e-07, 2.99609375e-07, 1.48046875e+01, 7.01562500e-08,
       6.23828125e-07])
Minimiser pool = SHGO.X_min = []
Successfully completed construction of complex.

The result was not the global minimum though.

How can I make shgo terminate successfully, preferably with COBYLA?

Konrad Rudolph
loco39
3 Answers


I think intermediate (infeasible) solutions may not obey the bounds. (Other NLP solvers never evaluate the function with bounds violated; that is a better approach, since it means the bounds can be used to protect against bad evaluations.) Given that you have these out-of-bound function evaluations, you can try two things:

  1. Project variables onto their bounds before calling the simulator.
  2. If bounds are not obeyed, immediately return a large value and don't even call the simulator.
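The first option might be sketched like this (my own illustration, not code from the thread; `run_simulation` is a placeholder for the real black-box call):

```python
import numpy as np

bnds = [(50e-9, 500e-9), (50e-9, 500e-9), (1, 20), (20e-9, 80e-9), (250e-9, 800e-9)]
lower, upper = np.array(bnds).T  # per-variable lower/upper bound arrays

def run_simulation(x):
    # Placeholder for the real (expensive) simulator call.
    return -float(np.sum(x))

def objective(x):
    # Project the trial point onto the box before the expensive call,
    # so the simulator never sees out-of-bound inputs.
    x_safe = np.clip(x, lower, upper)
    return run_simulation(x_safe)
```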
Erwin Kalvelagen

Thanks for your comment.

I added these few lines of code within the objective function to avoid calling the simulator with out-of-bounds inputs.

def objective(x):
    global iteration
    if all(lower_bound <= variable <= upper_bound for variable, (lower_bound, upper_bound) in zip(x, bnds)):
        target_value = run_simulation(x)     
    else:
        target_value = ?

    return target_value

I'm not sure which value to pass to the optimization algorithm in the else branch so that it does not interfere with the result of the optimizer. `np.nan` and `0` do not work.
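A standard trick here (my suggestion, not something from the thread) is to return a large finite penalty value — for a minimizer, something clearly worse than any value the simulation can produce — instead of `np.nan` or `0`:

```python
import numpy as np

bnds = [(50e-9, 500e-9), (50e-9, 500e-9), (1, 20), (20e-9, 80e-9), (250e-9, 800e-9)]
PENALTY = 1e10  # finite, and far worse than any achievable objective value

def run_simulation(x):
    # Placeholder for the real simulator call.
    return -float(np.sum(x))

def objective(x):
    if all(lo <= xi <= hi for xi, (lo, hi) in zip(x, bnds)):
        return run_simulation(x)
    # A large finite value keeps derivative-free methods well defined
    # (unlike np.nan) and steers the search back inside the box.
    return PENALTY
```

The penalty must dominate every feasible objective value: since the objective here is negated for maximization, `0` lies inside the range of real outputs and silently creates a fake optimum, while `np.nan` breaks the comparisons the optimizer makes.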

loco39

Ok...I solved the problem.

The problem was the bounds with values very close to zero (10^-9). So I removed the 10^-9 scaling from the variables and simply reapplied it elsewhere in the script.
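The rescaling described above might look like this (a sketch under my own naming; `run_simulation` stands in for the real simulator, assumed to expect metres):

```python
import numpy as np

# Bounds in O(1)-O(100) units: nanometres for the lengths, raw for x[2].
bnds_scaled = [(50, 500), (50, 500), (1, 20), (20, 80), (250, 800)]
scale = np.array([1e-9, 1e-9, 1.0, 1e-9, 1e-9])  # nm -> m where needed

def run_simulation(x):
    # Placeholder: the real simulator expects SI units (metres).
    return -float(np.sum(x))

def objective_scaled(x):
    # The optimizer only ever sees numbers of order 1-1000; the 1e-9
    # factor is reapplied just before handing the point to the simulator.
    return run_simulation(np.asarray(x, dtype=float) * scale)
```

The constraints need the same unit change, e.g. `x[1] - x[3] - 20` instead of `x[1] - x[3] - 20*1e-9`.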

However,now the next problem has popped up:

The algorithm does a rough global search with 8-10 iterations before starting the local minimization. I find this not quite enough, as there are 5 input parameters. Furthermore, the local minimization routine keeps 'digging' in the same spot for 20+ iterations, only adjusting the input parameters by less than 0.5 at a time.

My aim is to increase the number of global iterations to better cover the parameter range, and thereby reduce the number of local iterations, where only small and therefore negligible changes in the output occur. Or, alternatively, to increase the step size for the local minimizer rounds.

I have tried different input arguments of the shgo algorithm, such as 'n', 'iters', 'maxfev', 'maxev' and 'f_tol'. None of them produced the desired result.
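For what it's worth, this is how I would try to shift effort from local to global search — `n` and `iters` set the Sobol sampling budget, and COBYLA's `rhobeg` option (its initial trust-region radius) controls the initial local step size. The concrete numbers below are placeholders on a cheap toy objective, not tuned values:

```python
import numpy as np
from scipy.optimize import shgo

def objective(x):
    # Cheap stand-in for the simulation, with its minimum at x = (3, ..., 3).
    return float(np.sum((np.asarray(x) - 3.0) ** 2))

bnds = [(0.0, 10.0)] * 5

res = shgo(
    objective,
    bounds=bnds,
    n=256,                # more Sobol points per iteration -> denser global coverage
    iters=3,              # more global iterations before/between local minimizations
    sampling_method='sobol',
    minimizer_kwargs={'method': 'COBYLA',
                      'options': {'rhobeg': 1.0}},  # larger initial COBYLA step
)
print(res.x, res.fun)
```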

loco39