I am trying to use scipy to solve a non-trivial, non-linear optimisation problem in 6 dimensions. I think this can be solved with scipy.optimize.minimize; see https://docs.scipy.org/doc/scipy/reference/tutorial/optimize.html#constrained-minimization-of-multivariate-scalar-functions-minimize The documentation shows that constraints can be specified one by one:
cons = ({'type': 'eq',
         'fun' : lambda x: np.array([x[0]**3 - x[1]]),
         'jac' : lambda x: np.array([3.0*(x[0]**2.0), -1.0])},
        {'type': 'ineq',
         'fun' : lambda x: np.array([x[1] - 1]),
         'jac' : lambda x: np.array([0.0, 1.0])})
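For context, here is a minimal runnable sketch of how such a constraint tuple is passed to minimize; the quadratic objective and the starting point x0 are placeholders of my own, not from the docs:

```python
import numpy as np
from scipy.optimize import minimize

# constraints as in the scipy tutorial: x1 == x0**3 (eq) and x1 >= 1 (ineq)
cons = ({'type': 'eq',
         'fun' : lambda x: np.array([x[0]**3 - x[1]]),
         'jac' : lambda x: np.array([3.0*(x[0]**2.0), -1.0])},
        {'type': 'ineq',
         'fun' : lambda x: np.array([x[1] - 1]),
         'jac' : lambda x: np.array([0.0, 1.0])})

# placeholder objective: distance to the point (2, 2)
objective = lambda x: (x[0] - 2.0)**2 + (x[1] - 2.0)**2

res = minimize(objective, x0=[1.5, 1.5], method='SLSQP', constraints=cons)
print(res.x)
```

SLSQP is used because it is one of the minimize methods that accepts this style of constraint dictionaries.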
In my case I need 12 constraints: one minimum and one maximum for each dimension. I also want something I can generalise to more or fewer dimensions, so I do not want to type all the constraints by hand. Assume my minima and maxima are stored in two arrays, mins and maxs. If I use a loop such as:
cons = ()
for i in range(6):
    grad_min = np.zeros(6)
    grad_max = np.zeros(6)
    grad_min[i] = +1.0
    grad_max[i] = -1.0
    cons_min = {'type': 'ineq',
                'fun' : lambda x: np.array([x[i] - mins[i]]),
                'jac' : lambda x: grad_min}
    cons_max = {'type': 'ineq',
                'fun' : lambda x: np.array([-x[i] + maxs[i]]),
                'jac' : lambda x: grad_max}
    cons = cons + (cons_min, cons_max)
But because of Python's late-binding closures, this fails: every lambda sees the final value of i (i = 5), so only the last pair of constraints is enforced. I am looking for another design where I can define all the constraints in a loop; I need something that captures each i correctly.
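The late-binding behaviour I mean can be shown in isolation; the second list below uses the common default-argument idiom to freeze each i at definition time (included just to illustrate the mechanism, not claiming it is the cleanest design for the constraints):

```python
# each lambda captures the *variable* i, not its value at definition time,
# so after the loop every lambda reads the final i (here i == 2)
funcs = [lambda x: x[i] for i in range(3)]
print([f([10, 20, 30]) for f in funcs])   # prints [30, 30, 30]

# binding i as a default argument snapshots its value per iteration
fixed = [lambda x, i=i: x[i] for i in range(3)]
print([f([10, 20, 30]) for f in fixed])   # prints [10, 20, 30]
```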