I am using scipy.optimize.minimize() to minimise a certain function. I want to compare the performance of two methods, BFGS and L-BFGS-B, and for that I would like the routine to print out the current function value and error margin at each iteration as it is optimising.
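For reference, the comparison is set up roughly like this (the Rosenbrock function and the starting point below are only stand-ins for my actual problem, which has 243 variables):

import numpy as np
from scipy.optimize import minimize, rosen, rosen_der

x0 = np.full(243, 1.5)   # stand-in starting point

# L-BFGS-B run, which prints an iteration log like the one shown below
res_lbfgsb = minimize(rosen, x0, jac=rosen_der, method='L-BFGS-B',
                      options={'disp': True})

# BFGS run, which only prints a summary once it has finished
res_bfgs = minimize(rosen, x0, jac=rosen_der, method='BFGS',
                    options={'disp': True})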
L-BFGS-B does this automatically, in fact, and its output looks like the following:
At X0 0 variables are exactly at the bounds
At iterate 0 f= 7.73701D+04 |proj g|= 1.61422D+03
At iterate 1 f= 4.33415D+04 |proj g|= 1.16289D+03
At iterate 2 f= 9.97661D+03 |proj g|= 5.04925D+02
At iterate 3 f= 4.10666D+03 |proj g|= 3.04707D+02
....
At iterate 194 f= 3.34407D+00 |proj g|= 3.55117D-04
At iterate 195 f= 3.34407D+00 |proj g|= 3.36692D-04
At iterate 196 f= 3.34407D+00 |proj g|= 9.58307D-04
Tit = total number of iterations
Tnf = total number of function evaluations
Tnint = total number of segments explored during Cauchy searches
Skip = number of BFGS updates skipped
Nact = number of active bounds at final generalized Cauchy point
Projg = norm of the final projected gradient
F = final function value
* * *
N Tit Tnf Tnint Skip Nact Projg F
243 196 205 1 0 0 9.583D-04 3.344D+00
F = 3.34407234824719
Does anyone know how I can do the same for BFGS?
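What I would like to avoid is hand-rolling the logging through the callback argument of minimize(). A minimal sketch of that workaround (again with a stand-in objective) would be:

import numpy as np
from scipy.optimize import minimize, rosen, rosen_der

x0 = np.full(243, 1.5)   # stand-in starting point
iteration = [0]

def log_progress(xk):
    # minimize() calls this after every BFGS iteration with the current point xk
    print(f"At iterate {iteration[0]:5d}    f= {rosen(xk):.5e}    |grad|= {np.linalg.norm(rosen_der(xk)):.5e}")
    iteration[0] += 1

res = minimize(rosen, x0, jac=rosen_der, method='BFGS', callback=log_progress)

I would rather use the solver's own diagnostic output, as with L-BFGS-B above.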
Note: This question is related to a larger question posted here: SciPy optimisation: Newton-CG vs BFGS vs L-BFGS, about the divergence in behaviour between these two algorithms on a particular optimisation problem. I want to track down exactly where the two algorithms start to diverge.