I am looking for an optimization method in SciPy that lets me minimize an objective function f(x, y) (which returns a vector) subject to the constraint g(x, y) < 0.1 and additional bounds on x and y.
I have tried to solve my problem with scipy.optimize.least_squares, scipy.optimize.leastsq and scipy.optimize.minimize. The problem is that leastsq and least_squares allow the objective function to be non-scalar, but do not give me the possibility of implementing a constraint (only bounds). minimize, on the other hand, gives me the possibility of implementing both a constraint and bounds, but f(x, y) must return a scalar. Hence, I am looking for a solution that combines both. Does anyone know whether something like this exists?
The function I want to minimize is
def my_cost(p, f_noise):
    x, y = p[0], p[1]
    f = # some function that returns a 3x1 array
    return (f - f_noise)**2
I did this with the least_squares method.
opti.least_squares(my_cost, p0[:], args=(f_noise,), gtol=1e-2, bounds=bounds)
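To make the setup above concrete, here is a self-contained sketch of that least_squares call; the 3x1 model f, the noise vector and the bounds are toy stand-ins of my own, not the real functions:

```python
import numpy as np
from scipy.optimize import least_squares

def my_cost(p, f_noise):
    x, y = p[0], p[1]
    f = np.array([x, x * y, y])   # toy stand-in for the real 3x1 model
    return (f - f_noise) ** 2     # non-scalar residual, which least_squares accepts

f_noise = np.array([1.0, 2.0, 2.0])
p0 = np.array([0.5, 0.5])
bounds = ([0.0, 0.0], [10.0, 10.0])   # least_squares takes (lower, upper) arrays

res = least_squares(my_cost, p0, args=(f_noise,), gtol=1e-2, bounds=bounds)
print(res.x)
```

This runs fine, but as described there is no way to attach a general constraint here, only the box bounds.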
But here I have the problem that I cannot constrain the variables in p. I need to constrain p so that it fulfils
def constraint(p):
    x, y = p[0], p[1]
    # require fy(x) - y <= 0.1, so variable y becomes a function of variable x;
    # SciPy inequality constraints expect fun(p) >= 0, hence the rearrangement
    return 0.1 - (fy(x) - y)
To implement the constraint, I tested scipy's minimize function
opti.minimize(my_cost, p0[:], args=(f_noise,), bounds=bounds, constraints={'type': 'ineq', 'fun': constraint})
But here I can't seem to find a way to allow my_cost and f_noise to be 3x1 arrays.
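In case it helps frame the question: the usual workaround is to collapse the 3x1 residual into a scalar (e.g. its sum) before handing it to minimize, which is essentially what least_squares does internally anyway. A minimal sketch, where the toy model and the quadratic stand-in for fy are my assumptions, not the real functions:

```python
import numpy as np
from scipy.optimize import minimize

def my_cost(p, f_noise):
    x, y = p[0], p[1]
    f = np.array([x, x * y, y])   # toy stand-in for the real 3x1 model
    return (f - f_noise) ** 2     # vector of squared residuals

def scalar_cost(p, f_noise):
    # minimize needs a scalar objective, so sum the squared residuals
    return np.sum(my_cost(p, f_noise))

f_noise = np.array([1.0, 2.0, 2.0])
p0 = np.array([0.5, 0.5])
bounds = [(0.0, 10.0), (0.0, 10.0)]

# 'ineq' means fun(p) >= 0, so "fy(x) - y <= 0.1" becomes "0.1 - (fy(x) - y) >= 0";
# here fy(x) = x**2 is a made-up placeholder
cons = {'type': 'ineq', 'fun': lambda p: 0.1 - (p[0] ** 2 - p[1])}

res = minimize(scalar_cost, p0, args=(f_noise,), bounds=bounds, constraints=cons)
print(res.x, res.fun)
```

With both bounds and constraints present, minimize falls back to SLSQP, which handles the inequality directly; the open question is whether something exists that keeps the residual vector form as well.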
I'd be very grateful for any help. Cheers for your time!