Given the lack of answers and comments to this question, at first I thought it was impossible to accomplish. So I opened an enhancement issue on Scipy's GitHub page and asked about adding this functionality.
One of the contributors closed my issue, saying that this can be done, albeit in a tricky (and, in my opinion, not very elegant) manner, and he gave me some hints.
This is the solution I've come up with; I hope it helps:
from scipy.optimize import differential_evolution
from scipy.optimize import rosen
import numpy

class MinimizeStopper(object):
    def __init__(self, f=rosen, tau=1):
        self.fun = f                    # set the objective function
        self.best_x = None
        self.best_func = numpy.inf
        self.tau = tau                  # set the user-desired threshold

    def __call__(self, xk, convergence=None, *args, **kwds):
        fval = self.fun(xk, *args, **kwds)
        if fval < self.best_func:
            # keep track of the best solution seen so far
            self.best_func = fval
            self.best_x = xk
        if self.best_func <= self.tau:
            print("Terminating optimization: objective function threshold triggered")
            print(self.best_x)
            # returning True tells differential_evolution to stop early
            return True
        else:
            return False

bounds = [(0, 2), (0, 2), (0, 2), (0, 2), (0, 2)]
result = differential_evolution(rosen, bounds, callback=MinimizeStopper(),
                                polish=False, disp=True, maxiter=100,
                                popsize=100)
print(result)
which returns
differential_evolution step 1: f(x)= 10.7709
differential_evolution step 2: f(x)= 10.7709
differential_evolution step 3: f(x)= 8.02332
differential_evolution step 4: f(x)= 2.16592
differential_evolution step 5: f(x)= 2.16592
differential_evolution step 6: f(x)= 2.16592
differential_evolution step 7: f(x)= 0.812177
Terminating optimization: objective function threshold triggered
[1.01141374 0.95894166 0.91957732 0.87022813 0.70102066]
fun: 0.8121773465012827
message: 'callback function requested stop early by returning True'
nfev: 4000
nit: 7
success: False
x: array([1.01141374, 0.95894166, 0.91957732, 0.87022813, 0.70102066])
A few notes:
- the solution is inelegant because it requires an additional evaluation of the fitness function just to check the stopping criterion. Unfortunately, there is no workaround, due to the inner structure of the scipy.optimize module (a quick way to measure this extra cost is sketched right after this list)
- I have tested this approach on rosen, and it generally works as long as the objective function does not need any additional parameters; if it does, then I think one must play around with *args (see the second sketch below)
- printing self.best_x in the second if branch is not mandatory, of course. It was just a debugging check I added to verify that the best solution found by the callback is actually the one returned in result (i.e., the overall best solution found by differential_evolution())
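Regarding the first note, one can actually measure the extra cost by wrapping the objective in a call counter; CountingWrapper below is just a helper I made up for illustration, and it reuses MinimizeStopper from the code above. Since scipy knows nothing about the evaluation performed inside the callback, the counter's total exceeds result.nfev by roughly one call per generation:

from scipy.optimize import differential_evolution, rosen

class CountingWrapper:
    def __init__(self, f):
        self.f = f
        self.calls = 0

    def __call__(self, x):
        self.calls += 1
        return self.f(x)

counted = CountingWrapper(rosen)
stopper = MinimizeStopper(f=counted)   # the callback also evaluates through the counter
bounds = [(0, 2)] * 5
result = differential_evolution(counted, bounds, callback=stopper,
                                polish=False, maxiter=100, popsize=100)
# counted.calls is larger than result.nfev: the difference is the number
# of callback invocations (roughly one per generation)
print(counted.calls, result.nfev)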
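As for the second note, one way to handle an objective that needs extra parameters, instead of fiddling with *args, is to bind them beforehand with functools.partial, so that the optimizer and the callback evaluate exactly the same callable. This is only a minimal sketch under that assumption: shifted_rosen and its offset argument are made-up names for illustration, and MinimizeStopper is the class defined above:

from functools import partial

from scipy.optimize import differential_evolution, rosen
import numpy

# hypothetical objective with an extra parameter: Rosenbrock on a shifted input
def shifted_rosen(x, offset):
    return rosen(numpy.asarray(x) - offset)

obj = partial(shifted_rosen, offset=0.1)   # bind the extra parameter
bounds = [(0, 2)] * 5
result = differential_evolution(obj, bounds,
                                callback=MinimizeStopper(f=obj, tau=1),
                                polish=False, disp=True,
                                maxiter=100, popsize=100)

Note that differential_evolution also accepts an args tuple that is forwarded to the objective, but the callback only receives xk, so pre-binding the parameters keeps the objective and the callback's self.fun consistent.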