
I have the same problem as in this question, but I want to add several constraints to the optimization problem rather than just one.

So e.g. I want to maximize x1 + 5 * x2 with the constraints that the sum of x1 and x2 is smaller than 5 and x2 is smaller than 3 (needless to say, the actual problem is far more complicated and cannot just be thrown into scipy.optimize.minimize like this one; it just serves to illustrate the problem...).

I can do an ugly hack like this:

from scipy.optimize import differential_evolution
import numpy as np

def simple_test(x, more_constraints):

    # check whether all constraints evaluate to True
    if all(map(eval, more_constraints)):

        return -1 * (x[0] + 5 * x[1])

    # if not all constraints evaluate to True, return a positive number
    return 10

bounds = [(0., 5.), (0., 5.)]

additional_constraints = ['x[0] + x[1] <= 5.', 'x[1] <= 3']
result = differential_evolution(simple_test, bounds, args=(additional_constraints, ), tol=1e-6)
print(result.x, result.fun, sum(result.x))

This will print

[ 1.99999986  3.        ] -16.9999998396 4.99999985882

as one would expect.

Is there a better/more straightforward way to add several constraints than using the rather 'dangerous' eval?

Cleb
  • You don't need eval. Just follow the approach used in the scipy.optimize.minimize examples: create functions or lambda functions and evaluate them all in your test. When called from simple_test, you already have access to x, which you can then pass. (More important would be the underlying theory of DE and whether this approach is actually a good one.) – sascha Nov 18 '17 at 18:19
  • @sascha: Would you mind adding an answer with some code below? More than happy to upvote it if it is useful. – Cleb Nov 18 '17 at 18:23
  • I see no need, and I'm still not fully convinced the general concept works out, but I'm a bit biased in regards to all these global-opt heuristics. I'm just saying, you create lambdas or functions like [here](https://docs.scipy.org/doc/scipy-0.18.1/reference/generated/scipy.optimize.minimize.html#scipy.optimize.minimize). Of course, in your case you would call these from your ```simple_test```. The only gain, though, is the eval-free usage. Example: ```check_0 = lambda x: x[0] + x[1] <= 5```. – sascha Nov 18 '17 at 18:25
  • @sascha: Thanks. I tried scipy's `minimize` but it appeared quite sensitive to initial conditions for my system; `differential_evolution`, however, worked fine, but there is no built-in way of passing additional constraints as far as I can see. Is the answer below what you had in mind as well? – Cleb Nov 18 '17 at 18:32
  • I did not recommend minimize. It's a different thing for sure. The answer is basically what I wanted. Not sure if that helps much, but it looks a bit nicer. In general, you should be careful about the underlying method and how it behaves with this. It's probably non-trivial. Some approach of mixing DE and minimize (some usage is supported with arguments) would probably be better, but I don't have something specific in mind. Compare with [this](https://stackoverflow.com/questions/47055970/how-can-i-write-bounds-of-parameters-by-using-basinhopping/47058263#47058263), which is a bit related. – sascha Nov 18 '17 at 18:36
  • The example I linked above is only there to show that there are sometimes hidden things in those optimizers, and when there is no clearly dedicated support for bounds and constraints, bad stuff can happen. I'm not saying it will for DE too. But do your own observations. – sascha Nov 18 '17 at 18:37

2 Answers


An example would be something like this:

additional_constraints = [lambda x: x[0] + x[1] <= 5., lambda x: x[1] <= 3]

def simple_test(x, more_constraints):

    # check whether all constraints evaluate to True
    if all(constraint(x) for constraint in more_constraints):

        return -1 * (x[0] + 5 * x[1])

    # if not all constraints evaluate to True, return a positive number
    return 10
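
A minimal sketch of how these lambdas could then be passed to differential_evolution, simply reusing the bounds and the call from the question with the eval-based strings swapped for the lambdas above:

from scipy.optimize import differential_evolution

bounds = [(0., 5.), (0., 5.)]
result = differential_evolution(simple_test, bounds,
                                args=(additional_constraints,), tol=1e-6)
print(result.x, result.fun, sum(result.x))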
Ashalynd

There is a proper way to solve the problem described in the question, i.e. to enforce multiple nonlinear constraints with scipy.optimize.differential_evolution.

The proper way is to use the scipy.optimize.NonlinearConstraint class.

Below I give a non-trivial example: optimizing the classic Rosenbrock function inside a region defined by the intersection of two circles.

import numpy as np
from scipy import optimize

# Rosenbrock function
def fun(x):
    return 100*(x[1] - x[0]**2)**2 + (1 - x[0])**2

# Function defining the nonlinear constraints:
# 1) x^2 + (y - 3)^2 <= 4
# 2) (x - 1)^2 + (y + 1)^2 <= 13
def constr_fun(x):
    r1 = x[0]**2 + (x[1] - 3)**2
    r2 = (x[0] - 1)**2 + (x[1] + 1)**2
    return r1, r2

# No lower limit on constr_fun
lb = [-np.inf, -np.inf]

# Upper limit on constr_fun
ub = [4, 13]

# Bounds are irrelevant for this problem, but are needed
# for differential_evolution to compute the starting points
bounds = [[-2.2, 1.5], [-0.5, 2.2]]

nlc = optimize.NonlinearConstraint(constr_fun, lb, ub)
sol = optimize.differential_evolution(fun, bounds, constraints=nlc)

# Accurate solution by Mathematica
true = [1.174907377273171, 1.381484428610871]
print(f"nfev = {sol.nfev}")
print(f"x = {sol.x}")
print(f"err = {sol.x - true}\n")

This prints the following with default parameters:

nfev = 636
x = [1.17490808 1.38148613]
err = [7.06260962e-07 1.70116282e-06]
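
To double-check that the reported solution is actually feasible, a quick sanity check can be appended to the script above (this reuses fun, constr_fun and sol exactly as defined there):

# verify that both constraint values are within their upper limits
r1, r2 = constr_fun(sol.x)
print(r1 <= 4, r2 <= 13)   # expected: True True
print(fun(sol.x))          # objective value at the constrained minimum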

Here is a visualization of the function (contours) and the feasible region defined by the nonlinear constraints (shading inside the green line). The constrained global minimum is indicated by the yellow dot, while the magenta one shows the unconstrained global minimum.

This constrained problem has an obvious local minimum at (x, y) ~ (-1.2, 1.4) on the boundary of the feasible region, which will make local optimizers fail to converge to the global minimum from many starting locations. However, differential_evolution consistently finds the global minimum, as expected.

[figure: contour plot of the Rosenbrock function with the feasible region, the constrained minimum (yellow) and the unconstrained minimum (magenta)]
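
If you prefer to keep the two constraints as separate objects, which is closer to the "several constraints" of the question, the constraints argument should also accept a sequence of constraint objects (its default value is an empty tuple); here is a sketch under that assumption, reusing fun and bounds from the script above:

# sketch: two separate scalar constraints instead of one vector-valued one
nlc1 = optimize.NonlinearConstraint(lambda x: x[0]**2 + (x[1] - 3)**2, -np.inf, 4)
nlc2 = optimize.NonlinearConstraint(lambda x: (x[0] - 1)**2 + (x[1] + 1)**2, -np.inf, 13)
sol2 = optimize.differential_evolution(fun, bounds, constraints=(nlc1, nlc2))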

divenex
  • Thanks, and excellent timing as I might soon need this again :) I will take a look over the weekend and then get back to it with an upvote etc. – Cleb Mar 04 '22 at 08:42
  • When I run the code as it is, I get consistently different solutions `nfev = 30856 x = [-0.77427633 0.6126383 ], err = [-1.84217538 -0.65136404]` and `sol.message` tells me that it was a successful optimization. However, when I check `fun(sol.x)` I obtain `3.165` while for `fun(true)` I get `1.53`, so `true` is indeed better. Which `scipy` version are you using? – Cleb Mar 04 '22 at 23:01
  • Seems it works well for my toy example (upvoted). Will play a bit more and then probably accept soon as that solution is indeed quite nice. – Cleb Mar 04 '22 at 23:36
  • Thanks Cleb, my bad. I had a typo in my example. I replaced it with a correct and less difficult one, where `differential_evolution` converges with default parameters. – divenex Mar 05 '22 at 16:24
  • @Cleb, did this solution work for you after my fix? – divenex Apr 05 '22 at 08:03
  • It did, but I forgot to accept :) – Cleb Apr 05 '22 at 14:12