I'm using scipy.optimize.minimize with method='bfgs' to train a convex objective.
Every time I run a minimization, the first two calls the BFGS optimizer makes to my objective function use exactly the same parameter vector. This seems unnecessary, since it wastes several minutes recomputing the same result twice.
Minimal working example (with a much simpler objective):
from scipy.optimize import minimize

def obj_jac(x):
    """Return the objective value and its Jacobian wrt x."""
    print(x)
    return 10 * x**2, 20 * x

minimize(obj_jac, -100, method='bfgs', jac=True, tol=1e-7)
Output:
[-100.]
[-100.]
[-98.99]
[-94.95]
[-78.79]
[-30.17904355]
[-3.55271368e-15]
Does anyone know if this is expected behaviour for the BFGS implementation in scipy?
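In the meantime, one possible workaround is to cache the most recent evaluation yourself, so a repeated call with an identical x returns the stored result instead of recomputing. The wrapper below (make_cached is my own hypothetical helper, not part of SciPy) is a minimal sketch of that idea, assuming the duplicate calls really do use a bit-identical parameter vector:

```python
from scipy.optimize import minimize
import numpy as np

def make_cached(fun):
    """Wrap an objective that returns (value, jacobian) so that a
    repeated call with the same x reuses the previous result.
    Only the most recent evaluation is cached."""
    cache = {"x": None, "result": None}

    def wrapped(x):
        x = np.asarray(x, dtype=float)
        # Cache hit: identical parameter vector, return stored result.
        if cache["x"] is not None and np.array_equal(x, cache["x"]):
            return cache["result"]
        # Cache miss: evaluate and remember.
        cache["x"] = x.copy()
        cache["result"] = fun(x)
        return cache["result"]

    return wrapped

def obj_jac(x):
    """Toy objective: prints x on every *actual* evaluation."""
    print(x)
    return 10 * x**2, 20 * x

res = minimize(make_cached(obj_jac), -100, method='bfgs', jac=True, tol=1e-7)
```

With this wrapper, the second call at x = [-100.] is served from the cache, so the expensive objective is only computed once per distinct point.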
Update: I have submitted this as issue #10385 on the SciPy issue tracker.