
I am getting some very weird results when running the `minimize` function from `scipy.optimize`.

Here is the code:

from scipy.optimize import minimize

def objective(x):
    return - (0.05 * x[0] ** 0.64 + 0.4 * x[1] ** 0.36)

def constraint(x):
    return x[0] + x[1] - 5000

cons = [{'type':'eq', 'fun': constraint}]

When running

minimize(objective, [2500.0, 2500.0], method='SLSQP',  constraints=cons)

I get an allocation of 2500 for each element of `x`, with `fun: -14.164036415985395`.

A quick check shows that the allocation `[3800, 1200]` gives about `-14.9`, which is better.
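That quick check can be reproduced directly by evaluating the objective at both allocations:

```python
# Evaluate the objective at the solver's answer and at the hand-picked allocation.
def objective(x):
    return -(0.05 * x[0] ** 0.64 + 0.4 * x[1] ** 0.36)

print(objective([2500.0, 2500.0]))  # ~ -14.164, the value SLSQP reports
print(objective([3800.0, 1200.0]))  # ~ -14.91, clearly lower (better)
```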

The result is also highly sensitive to the initial conditions.

Any thoughts as to what I am doing wrong?

(plot of the two objective functions)
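As a sanity check of my own (not part of the solver), a brute-force scan along the constraint line `x[0] + x[1] = 5000` shows the true constrained optimum is nowhere near `[2500, 2500]`:

```python
import numpy as np

# Scan the constraint line x0 + x1 = 5000 and find the best allocation by brute force.
x0 = np.linspace(1.0, 4999.0, 4999)
x1 = 5000.0 - x0
f = -(0.05 * x0 ** 0.64 + 0.4 * x1 ** 0.36)

best = x0[np.argmin(f)]
print(best, f.min())  # best allocation is roughly x0 ~ 3900, f ~ -14.9
```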

UPDATE: `minimize` actually returns the initial conditions unchanged.

If I try this instead, though:

def objective(x):
    return - (x[0] ** 0.64 + x[1] ** 0.36)

def constraint(x):
    return x[0] + x[1] - 5000.0

cons = [{'type':'eq', 'fun': constraint}]

minimize(objective, [2500, 2500], method='SLSQP', constraints=cons)

the returned results seem to be just fine (note that I have changed the objective function).

dimitris_ps
  • FYI: Before you look at the parameters returned (i.e. `result.x`), you should first check `result.success`. Is that value True or False? – Warren Weckesser Apr 21 '19 at 23:59
  • @warren-weckesser thanks for the comment. Both have `success = True` – dimitris_ps Apr 22 '19 at 00:08
  • With a grain of salt: while strange at first (assuming the solver will find a local-optimum, which it should) this looks like it's not a convex minimization problem (indefinite hessian) and therefore the guarantees of this solver are not enough to provide a global optimum. For tiny problems like this, global solvers like *couenne* should work. (and two new tags for one question; not sure if that helps) – sascha Apr 22 '19 at 00:17
  • If I change the method to `method='trust-constr'`, the result is `[3901.68798782, 1098.31201218]`, which agrees with what I see in a plot of the objective function. I haven't looked into the details of scipy's minimizers. @sascha's comment might explain the problem--the behavior might be the result of applying the SLSQP algorithm to a problem for which it is not designed. If possible, though, it would be nice if the algorithm could detect that something went wrong and raise an exception. Could you report this example in an issue at https://github.com/scipy/scipy/issues? – Warren Weckesser Apr 22 '19 at 00:40
  • @WarrenWeckesser thanks again for looking into it. I hadn't looked into `method='trust-constr'`, but when running that I get `Unsupported jac definition.` I will report the issue, sure. – dimitris_ps Apr 22 '19 at 00:53
  • I have version 1.1; it should be fixed when I update. – dimitris_ps Apr 22 '19 at 00:57
  • On version 1.2.1 I now get `array must not contain infs or NaNs` with `method='trust-constr'` for the second example. – dimitris_ps Apr 22 '19 at 01:01
  • I also used 1.2.1. I didn't try the second example. With the first example, I forgot that I had changed the initial guess to [3000, 2000], so it was a bit closer to the minimum. That call finished with no errors or warnings. If I use [2500, 2500] for the initial guess, I get a warning: `[...]/_hessian_update_strategy.py:187: UserWarning: delta_grad == 0.0. Check if the approximated function is linear. If the function is linear better results can be obtained by defining the Hessian as zero instead of using quasi-Newton approximations.` but the answer is still correct. – Warren Weckesser Apr 22 '19 at 01:08
  • Apparently for `method='trust-constr'` there needs to exist an interior solution, i.e. with just the simple `def objective(x): return - (x[0] ** 2 + x[1])` I get an error. Thanks @WarrenWeckesser – dimitris_ps Apr 22 '19 at 01:25
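For reference, the `trust-constr` run described in the comments can be reproduced roughly as follows on a recent SciPy, using the initial guess `[3000, 2000]` mentioned there (exact behavior may vary by SciPy version):

```python
from scipy.optimize import minimize

def objective(x):
    return -(0.05 * x[0] ** 0.64 + 0.4 * x[1] ** 0.36)

def constraint(x):
    return x[0] + x[1] - 5000

cons = [{'type': 'eq', 'fun': constraint}]

# trust-constr finds the interior optimum that SLSQP misses here
res = minimize(objective, [3000.0, 2000.0], method='trust-constr', constraints=cons)
print(res.x)  # roughly [3901.7, 1098.3], matching the plot of the objective
```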

0 Answers