I would like to use scipy.optimize to minimize a function (eventually non-linear) over a large set of linear inequalities. As a warm-up, I'm trying to minimize x+y over the box 0<=x<=1, 0<=y<=1. Following the suggestion of Johnny Drama below, I am currently using a dict-comprehension to produce the dictionary of inequalities, but am not getting the expected answer (minimum value 0, attained at (0,0)).
New section of code (currently relevant):
import numpy as np
from scipy.optimize import minimize
#Create initial point.
x0=[.1,.1]
#Create function to be minimized
def obj(x):
    return x[0]+x[1]
#Create linear constraints lbnd<= A*(x,y)^T<= upbnd
A=np.array([[1,0],[0,1]])
b1=np.array([0,0])
b2=np.array([1,1])
cons=[{"type": "ineq", "fun": lambda x: np.matmul(A[i, :],x) -b1[i]} for i in range(A.shape[0])]
cons2=[{"type": "ineq", "fun": lambda x: b2[i]-np.matmul(A[i, :], x) } for i in range(A.shape[0])]
cons.extend(cons2)
sol=minimize(obj,x0,constraints=cons)
print(sol)
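(Aside: my best guess at the culprit is Python's late binding of closure variables; every lambda in the comprehension would then see the final value of i rather than its own. A minimal sketch of the fix, binding i at definition time via a default argument, assuming that is indeed the issue:)

```python
import numpy as np
from scipy.optimize import minimize

# Objective: f(x, y) = x + y
def obj(x):
    return x[0] + x[1]

A = np.array([[1, 0], [0, 1]])
b1 = np.array([0, 0])   # lower bounds on A @ (x, y)^T
b2 = np.array([1, 1])   # upper bounds on A @ (x, y)^T

# The default argument i=i freezes the row index per lambda,
# so each constraint keeps its own row instead of sharing the last i.
cons = [{"type": "ineq", "fun": lambda x, i=i: A[i, :] @ x - b1[i]}
        for i in range(A.shape[0])]
cons += [{"type": "ineq", "fun": lambda x, i=i: b2[i] - A[i, :] @ x}
         for i in range(A.shape[0])]

sol = minimize(obj, [0.1, 0.1], constraints=cons)
print(sol.x)  # should approach (0, 0)
```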
Original version of question:
I would like to use the LinearConstraint object in scipy.optimize, as described in the tutorial here: "Defining linear constraints"
I've tried to do a simpler example, where it's obvious what the answer should be: minimize x+y over the square 0<=x<=1, 0<=y<=1. Below is my code, which returns the error "'LinearConstraint' object is not iterable", but I don't see how I'm trying to iterate.
EDIT 1: The example is deliberately oversimplified. Ultimately, I want to minimize a non-linear function over a large number of linear constraints. I know that I can use dictionary comprehension to turn my matrix of constraints into a list of dictionaries, but I'd like to know if LinearConstraint can be used as an off-the-shelf way to turn matrices into constraints.
EDIT 2: As pointed out by Johnny Drama, LinearConstraint is for a particular method. So above I've tried to use instead his suggestion for a dict-comprehension to produce the linear constraints, but am still not getting the expected answer.
Original section of code (now irrelevant):
from scipy.optimize import minimize
from scipy.optimize import LinearConstraint
#Create initial point.
x0=[.1,.1]
#Create function to be minimized
def obj(x):
    return x[0]+x[1]
#Create linear constraints lbnd<= A*(x,y)^T <= upbnd
A=[[1,0],[0,1]]
lbnd=[0,0]
upbnd=[1,1]
lin_cons=LinearConstraint(A,lbnd,upbnd)
sol=minimize(obj,x0,constraints=lin_cons)
print(sol)