I am having trouble using shgo.shgo (which, as far as I can tell, is the same as scipy.optimize.shgo except for an annoying bug in the latter).
My optimization problem behaves badly when using SLSQP as the local minimizer, and I found various suggestions to switch to COBYLA. It seems there are known issues with passing bounds to COBYLA, which I am addressing as described here and here.
Still, the algorithm keeps searching outside some parameters' bounds, which makes my code crash. I could handle this "manually" by returning np.inf whenever the optimizer tries an out-of-bounds point, but I am not sure that is the clever way to do things.
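For reference, the np.inf penalty idea I mean can be sketched with a small wrapper like this (the `bounded` helper and the toy objective are mine, just to illustrate; the real objective would be objfun):

```python
import numpy as np

def bounded(fun, bounds):
    """Wrap an objective so out-of-bounds points return np.inf
    instead of crashing the underlying model evaluation."""
    def wrapped(x, *args):
        for xi, (lb, ub) in zip(x, bounds):
            if xi < lb or xi > ub:
                return np.inf  # penalize out-of-bounds trial points
        return fun(x, *args)
    return wrapped

# Toy example: a quadratic restricted to [0, 1] in each coordinate.
f = bounded(lambda x: sum(xi**2 for xi in x), [(0.0, 1.0), (0.0, 1.0)])
print(f([0.5, 0.5]))  # inside bounds -> 0.5
print(f([2.0, 0.5]))  # outside bounds -> inf
```

My worry is that feeding infinities to the local minimizer feels like a hack rather than the intended way to enforce bounds.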
Currently, I am running something like this:
from shgo import shgo

parameter_bounds = [
    (1, 10), (1, 10), (1, 10), (1, 10),
    (0.05, 1.1), (0.1, 1), (0.01421, 0.01431), (0.3, 1.7),
]

# Rewrite the bounds as inequality constraints for COBYLA; lb/ub/i are
# bound as default arguments to avoid the lambda late-binding pitfall.
cons = []
for i, (lower, upper) in enumerate(parameter_bounds):
    cons.append({"type": "ineq", "fun": lambda x, lb=lower, i=i: x[i] - lb})
    cons.append({"type": "ineq", "fun": lambda x, ub=upper, i=i: ub - x[i]})

result = shgo(
    func=objfun,
    args=(data_moments, W, rng),
    bounds=parameter_bounds,  # SHGO *needs* bounds
    minimizer_kwargs={"method": "COBYLA", "constraints": cons},
    options={"disp": True},
)
Any suggestion would be greatly appreciated.