I am trying to minimize a function using scipy.optimize.minimize
and I get the following errors:
Singular Jacobian matrix. Using SVD decomposition to perform the factorizations.
delta_grad == 0.0. Check if the approximated function is linear. If the function is linear better results can be obtained by defining the Hessian as zero instead of using quasi-Newton approximations.
My objective function is simply something along these lines:
def obj_func(x, x_new, idx, multipliers):
    # trade_syms and x_unchanged come from the enclosing scope
    x_tot = pd.concat([pd.Series(x, index=trade_syms), x_unchanged])
    obj = abs(x_tot.dot(multipliers))
    return obj
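For context, a minimal self-contained version of that setup would look roughly like this (the values of trade_syms, x_unchanged, and multipliers below are made up for illustration; my real data differs):

import numpy as np
import pandas as pd

trade_syms = ['A', 'B', 'C']                     # symbols being optimized (example values)
x_unchanged = pd.Series({'D': 0.1, 'E': -0.2})   # positions held fixed (example values)
multipliers = pd.Series(1.0, index=['A', 'B', 'C', 'D', 'E'])  # per-symbol multipliers (example values)
Nc = len(trade_syms)
x0 = np.zeros(Nc)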
I checked the following post and added the hess parameter to my optimize.minimize call, so it now looks like this:
optimize.minimize(obj_func,
                  x0=x0,
                  args=(a1, a2, a3),
                  bounds=((-0.5, 0.5),) * Nc,
                  constraints=cons,
                  hess=lambda x1, x2, x3, x4: np.zeros((Nc, Nc)),
                  method='trust-constr')
But I still get the same errors. My constraints are just two equality constraints:
cons = [{'type': 'eq', 'fun': func1},
        {'type': 'eq', 'fun': func2}]
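func1 and func2 are nothing exotic; as hypothetical stand-ins (not my real constraints), imagine something like:

# illustrative stand-ins with the same shape as my real constraint functions
def func1(x):
    return np.sum(x)         # e.g. weights must sum to zero

def func2(x):
    return np.sum(x**2) - 1  # e.g. squared norm must equal one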
Also, I don't know why I need to add x1, x2, x3, x4 to the lambda function. I did it because hess = lambda x: np.zeros((Nc, Nc)) gave me an error like "Expected 4 arguments to unpack, but got only one".
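From reading the scipy docs, my guess is that hess is invoked as hess(x, *args), i.e. it receives the same extra arguments as the objective, so with args=(a1, a2, a3) it has to accept four parameters, something like:

# hess receives x plus the three entries of args, four parameters in total
hess = lambda x, a1, a2, a3: np.zeros((Nc, Nc))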
Anything I am missing here?