
I am solving the following optimization problem:

(The image showed the problem statement; transcribed here to match the code below:)

    minimize    log(x1^2 + 1) + x2^4 + x1*x3
    subject to  x1^3 - x2^2 >= 1
                x1 >= 0
                x3 >= 0

with this Python code:

from scipy.optimize import minimize
import math

def f(x):
    return math.log(x[0]**2 + 1) + x[1]**4 + x[0]*x[2]

x0 = [0, 0, 0]

cons=({'type': 'ineq',
       'fun': lambda x: x[0]**3 - x[1]**2 - 1},
      {'type': 'ineq',
       'fun': lambda x: x[0]},
      {'type': 'ineq',
       'fun': lambda x: x[2]})

res = minimize(f, x0, constraints=cons)
print(res)

I am getting an error

message: 'Inequality constraints incompatible'

What can cause this error?

Max
    FWIW, in another question https://stackoverflow.com/questions/55543140/minimize-quadratic-function-subject-to-linear-equality-constraints-with-scipy/55751247#55751247 I found that lowering the tolerance with `options={'ftol':1e-15}` helped, but it doesn't seem to help here. As in your case, I found that initial values matter a lot, and neither `SLSQP` nor `trust-constr` worked for me when the initial values were too far away. – JohnE Apr 26 '19 at 19:50
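As a sketch of the tolerance tweak mentioned in the comment above, applied to the question's own `f`, `cons`, and starting point (as noted there, it does not rescue this particular start; it only illustrates how SLSQP-specific options are passed):

```python
import math
from scipy.optimize import minimize

def f(x):
    return math.log(x[0]**2 + 1) + x[1]**4 + x[0]*x[2]

cons = ({'type': 'ineq', 'fun': lambda x: x[0]**3 - x[1]**2 - 1},
        {'type': 'ineq', 'fun': lambda x: x[0]},
        {'type': 'ineq', 'fun': lambda x: x[2]})

# Tighter function tolerance and a larger iteration budget for SLSQP;
# these are solver-specific options passed through `options`.
res = minimize(f, [0, 0, 0], method='SLSQP', constraints=cons,
               options={'ftol': 1e-15, 'maxiter': 500})
print(res.message)
```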

2 Answers


The issue seems to be with your initial guess. If I change your starting values to

x0 = [1.0, 1.0, 1.0]

Then your code executes fine (at least on my machine):

Python 3.5.1 (v3.5.1:37a07cee5969, Dec 6 2015, 01:54:25) [MSC v.1900 64 bit (AMD64)] on win32

 message: 'Optimization terminated successfully.'
    njev: 10
     jac: array([ 1.,  0.,  1.,  0.])
     fun: 0.6931471805582502
     nit: 10
  status: 0
       x: array([  1.00000000e+00,  -1.39724765e-06,   1.07686548e-14])
 success: True
    nfev: 51
Cory Kramer
  • It works. Do you have any idea about the reason why it doesn't work with my x0? – Max Feb 16 '16 at 12:30
  • @Rextuz If I'm not mistaken, your initial guess doesn't satisfy the constraints, which would be a good reason the function fails. Yet, it seems Cory's initial guess doesn't satisfy the first constraint either. – rubenvb Feb 16 '16 at 12:32
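The feasibility point raised in these comments is easy to check by hand: SLSQP interprets each `'ineq'` constraint as `fun(x) >= 0`, so evaluating the constraint functions at a candidate start reveals any violation. A minimal sketch using the question's constraints:

```python
# SLSQP treats each 'ineq' constraint as fun(x) >= 0, so a negative
# value at the starting point means that start is infeasible.
cons = ({'type': 'ineq', 'fun': lambda x: x[0]**3 - x[1]**2 - 1},
        {'type': 'ineq', 'fun': lambda x: x[0]},
        {'type': 'ineq', 'fun': lambda x: x[2]})

for x0 in ([0, 0, 0], [1.0, 1.0, 1.0]):
    print(x0, [c['fun'](x0) for c in cons])
# [0, 0, 0] [-1, 0, 0]
# [1.0, 1.0, 1.0] [-1.0, 1.0, 1.0]
```

Both starts violate the first constraint, consistent with rubenvb's observation; SLSQP can sometimes recover from an infeasible start (as it happens to from `[1.0, 1.0, 1.0]`), but it is not guaranteed to.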

Scipy's optimize module has lots of options. See the documentation or this tutorial. Since you didn't specify the method here, it will use Sequential Least Squares Programming (SLSQP). Alternatively, you could use the Trust-Region Constrained Algorithm (trust-constr).

For this problem, I found trust-constr much more robust to starting values than SLSQP: it handled starts from [-2,-2,-2] to [10,10,10], although negative initial values took more iterations, as you'd expect. Starts below -2 exceeded the maximum iterations, though I suspect they would still converge with a higher iteration limit. (Negative starting values for x1 and x3 are silly given the constraints, of course; I only tried them to get a sense of how robust the method is.)

The specifications for SLSQP and trust-constr are conceptually the same, but the syntax is a little different (in particular, note the use of NonlinearConstraint).

import math
import numpy as np
from scipy.optimize import minimize, NonlinearConstraint, SR1

def f(x):
    return math.log(x[0]**2 + 1) + x[1]**4 + x[0]*x[2]

constr_func = lambda x: np.array( [ x[0]**3 - x[1]**2 - 1,
                                    x[0],
                                    x[2] ] )

x0=[0.,0.,0.]

nonlin_con = NonlinearConstraint( constr_func, 0., np.inf )

res = minimize( f, x0, method='trust-constr',
                jac='2-point', hess=SR1(),
                constraints = nonlin_con )

Here are the results, edited for conciseness:

    fun: 0.6931502233468916
message: '`gtol` termination condition is satisfied.'
      x: array([1.00000063e+00, 8.21427026e-09, 2.40956900e-06])

Note that the function value and x values are the same as in @CoryKramer's answer. The x arrays may look different at first glance, but both round to [1, 0, 0].
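To double-check that agreement, a quick sketch running both methods and comparing them against log(2) ≈ 0.6931, the value of the objective at the optimum [1, 0, 0] reported in both answers:

```python
import math
import numpy as np
from scipy.optimize import minimize, NonlinearConstraint, SR1

def f(x):
    return math.log(x[0]**2 + 1) + x[1]**4 + x[0]*x[2]

# SLSQP with the dict-style constraints and Cory's starting point.
cons = ({'type': 'ineq', 'fun': lambda x: x[0]**3 - x[1]**2 - 1},
        {'type': 'ineq', 'fun': lambda x: x[0]},
        {'type': 'ineq', 'fun': lambda x: x[2]})
res_slsqp = minimize(f, [1.0, 1.0, 1.0], method='SLSQP', constraints=cons)

# trust-constr with NonlinearConstraint and the original starting point.
nonlin_con = NonlinearConstraint(
    lambda x: np.array([x[0]**3 - x[1]**2 - 1, x[0], x[2]]), 0., np.inf)
res_tc = minimize(f, [0., 0., 0.], method='trust-constr',
                  jac='2-point', hess=SR1(), constraints=nonlin_con)

for res in (res_slsqp, res_tc):
    assert abs(res.fun - math.log(2)) < 1e-3   # optimum is log(2)
    assert np.allclose(np.round(res.x), [1, 0, 0])
```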

JohnE