
I want to show the progress of differential evolution and store the objective function values as it runs. My MWE is:

import numpy as np
from scipy.optimize import differential_evolution, LinearConstraint

# `opt` is the (expensive) objective function being minimized; it is defined elsewhere.

def de_optimise():
    def build_show_de(MIN=None):
        if MIN is None:
            MIN = [0]
        def fn(xk, convergence):
            obj_val = opt(xk)
            if obj_val < MIN[-1]:
                print("DE", [round(x, 2) for x in xk], obj_val)
                MIN.append(obj_val)  # reuse the computed value instead of calling opt again
        return fn

    bounds = [(0,1)]*3
    # Define the linear constraints
    A = [[-1, 1, 0], [0, -1, 1]]
    lb = [0.3, 0.4]
    ub = [np.inf, np.inf]
    constraints = LinearConstraint(A, lb, ub)
    progress_f = [0]
    c = build_show_de(progress_f)
    print("Optimizing using differential evolution")

    res = differential_evolution(
        opt, 
        bounds=bounds,
        constraints=constraints,
        callback=c, 
        disp=True
    )
    print(f"external way of keeping track of MINF: {progress_f}")

de_optimise()

It works, but in the function `fn` I have to recompute `opt(xk)`, which must already have been computed internally. I have to do this because the callback of `differential_evolution` is documented as follows:

callback : callable, `callback(xk, convergence=val)`, optional
    A function to follow the progress of the minimization. `xk` is the best solution found so far. `val` represents the fractional value of the population convergence. When `val` is greater than one the function halts. If callback returns True, then the minimization is halted (any polishing is still carried out).

Since `opt` is expensive, this extra evaluation slows the optimization down a lot. How can I avoid it?

Simd
  • Does this answer your question? [How to display progress of scipy.optimize function?](https://stackoverflow.com/questions/16739065/how-to-display-progress-of-scipy-optimize-function). Specifically the answer by Henri, though some other answers discuss the issues with `callback`. – jared Jun 20 '23 at 14:35
  • `opt` is already called for the entire population; is one more call really that costly? (See the sketch after these comments.) – Mikael Öhman Jun 20 '23 at 16:06
  • @MikaelÖhman that's a good point – Simd Jun 20 '23 at 16:37
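
To quantify Mikael Öhman's point, a small call-counting wrapper (the `Counted` class is a hypothetical illustration, not part of the question) shows how the callback's single extra evaluation per generation compares with the many evaluations `differential_evolution` already makes for the whole population:

from scipy.optimize import differential_evolution, rosen

class Counted:
    """Hypothetical wrapper that counts objective evaluations."""
    def __init__(self, f):
        self.f = f
        self.calls = 0

    def __call__(self, x):
        self.calls += 1
        return self.f(x)

counted = Counted(rosen)
generations = []
res = differential_evolution(counted, [(0, 10)] * 5,
                             callback=lambda xk, convergence: generations.append(1))
# One extra opt(xk) per generation would add len(generations) calls,
# a small fraction of the counted.calls evaluations DE performs anyway.
print(f"{counted.calls} objective calls over {len(generations)} generations")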

1 Answer


If I understand correctly you want something like this:

from scipy.optimize import differential_evolution, rosen
import numpy as np

class fn:
    """Objective wrapper that records the best solution seen so far."""
    def __init__(self):
        self.best_x = None
        self.minf = np.inf

    def __call__(self, x):
        f = rosen(x)
        # Store the lowest objective value and the point that produced it.
        if f < self.minf:
            self.minf = f
            self.best_x = x
        return f


class callback:
    """Reports progress using the values already recorded by fn."""
    def __init__(self, FN):
        self.FN = FN

    def __call__(self, xk, convergence):
        # xk is the best solution so far, so it matches the stored best_x.
        np.testing.assert_equal(xk, self.FN.best_x)
        print(self.FN.best_x, self.FN.minf)


FN = fn()
C = callback(FN)

res = differential_evolution(FN, [(0, 10)] * 5, callback=C)
print(res)
[1.73780654 1.5180404  2.07430624 1.72280576 5.55451018] 1567.9474614862615
[1.20847265 0.90208029 0.27028852 2.46391859 6.70219186] 674.9123192084
...
[1. 1. 1. 1. 1.] 0.0
 message: Optimization terminated successfully.
 success: True
     fun: 0.0
       x: [ 1.000e+00  1.000e+00  1.000e+00  1.000e+00  1.000e+00]
     nit: 591
    nfev: 44406
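
For completeness, a minimal sketch of the same pattern applied to the constrained setup from the question (`TrackedOpt` and `tracked` are hypothetical names; `opt` is the question's expensive objective, assumed to be defined elsewhere):

import numpy as np
from scipy.optimize import differential_evolution, LinearConstraint

class TrackedOpt:
    """Hypothetical wrapper that caches the best value of the question's opt."""
    def __init__(self):
        self.best_x = None
        self.minf = np.inf

    def __call__(self, x):
        f = opt(x)  # the expensive objective from the question
        if f < self.minf:
            self.minf = f
            self.best_x = x
        return f

tracked = TrackedOpt()
constraints = LinearConstraint([[-1, 1, 0], [0, -1, 1]], [0.3, 0.4], [np.inf, np.inf])
res = differential_evolution(
    tracked,
    bounds=[(0, 1)] * 3,
    constraints=constraints,
    callback=lambda xk, convergence: print(tracked.best_x, tracked.minf),
    disp=True,
)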
Andrew Nelson
  • There is a typo: it should be `res = differential_evolution(FN, [(0, 10)] * 5, callback=C)`. Also, your code seems to print new outputs even if `rosen(x)` is the same as before, e.g. `[1. 1. 1. 1. 1.] 0.0` repeated several times. – Simd Jun 21 '23 at 06:24
  • Fixed the typo. The callback prints at the end of every iteration. If the best solution hasn't changed then you'll see rows repeating. Convergence is reached when the standard deviation of population energies falls below a tolerance. At convergence iteration stops. – Andrew Nelson Jun 21 '23 at 07:42
  • Is it possible to print only on improvement? – Simd Jun 21 '23 at 07:45
  • Just store the lowest value printed so far in the callback object. Test whether `minf` has improved; if it has, print it (see the sketch below). – Andrew Nelson Jun 21 '23 at 07:56
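
A minimal sketch of that suggestion, reusing the `fn` class from the answer (`last_printed` is a hypothetical attribute name):

class callback:
    def __init__(self, FN):
        self.FN = FN
        self.last_printed = np.inf  # lowest value printed so far (hypothetical)

    def __call__(self, xk, convergence):
        # Only report when the best objective value has actually improved.
        if self.FN.minf < self.last_printed:
            self.last_printed = self.FN.minf
            print(self.FN.best_x, self.FN.minf)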