
SciPy provides two functions for nonlinear least-squares problems:

optimize.leastsq() uses the Levenberg-Marquardt algorithm only.

optimize.least_squares() lets us choose among the Levenberg-Marquardt ('lm'), Trust Region Reflective ('trf'), and dogleg ('dogbox') algorithms.

Should we always use least_squares() instead of leastsq()?

If so, what purpose does the latter serve?
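
For concreteness, here is a minimal sketch of the two interfaces (the toy exponential fit, data, and parameter names are my own illustration, not part of the question):

```python
import numpy as np
from scipy import optimize

# Synthetic data for y = a * exp(b * x) with a little noise.
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 50)
y = 2.5 * np.exp(1.3 * x) + 0.05 * rng.standard_normal(x.size)

def residuals(p):
    a, b = p
    return a * np.exp(b * x) - y

p0 = [1.0, 1.0]

# Old interface: Levenberg-Marquardt only; returns a plain tuple.
p_old, ier = optimize.leastsq(residuals, p0)

# New interface: the algorithm is chosen via `method`; returns an OptimizeResult.
res_lm  = optimize.least_squares(residuals, p0, method='lm')      # Levenberg-Marquardt
res_trf = optimize.least_squares(residuals, p0, method='trf')     # Trust Region Reflective
res_dog = optimize.least_squares(residuals, p0, method='dogbox')  # dogleg variant

print(p_old, res_lm.x, res_trf.x, res_dog.x)  # all should agree closely
```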


1 Answer


Short answer

Should we always use least_squares() instead of leastsq()?

Yes.

If so, what purpose does the latter serve?

Backward compatibility.

Explanation

The least_squares function is new in SciPy 0.17. Its documentation refers to leastsq as

A legacy wrapper for the MINPACK implementation of the Levenberg-Marquardt algorithm.

The original commit introducing least_squares actually called leastsq when the method was chosen to be 'lm'. But the contributor (Nikolay Mayorov) then decided that

least_squares might feel more solid and homogeneous if I write a new wrapper to MINPACK functions, instead of calling leastsq.

and so he did. As a result, leastsq is no longer needed by least_squares, but I'd expect it to be kept around for a while to avoid breaking old code.
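
Since both routines now wrap the same underlying MINPACK Levenberg-Marquardt code, they should converge to the same solution. A quick check (my own sketch, using a least-squares form of the classic Rosenbrock test problem):

```python
import numpy as np
from scipy import optimize

def residuals(p):
    # Least-squares form of the Rosenbrock function.
    return np.array([10.0 * (p[1] - p[0]**2), 1.0 - p[0]])

p0 = np.array([-1.2, 1.0])

x_old, ier = optimize.leastsq(residuals, p0)
res_new = optimize.least_squares(residuals, p0, method='lm')

print(np.allclose(x_old, res_new.x))  # True: same MINPACK routine under the hood
```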

  • In my experiments it turned out that `leastsq` is some 10-15% faster than `least_squares`. Can you comment on this? – Igor F. Nov 29 '20 at 13:56
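
A rough way to reproduce such a comparison (my own sketch; absolute numbers depend on the SciPy version and problem size). Any gap on cheap problems like this is plausibly per-call Python overhead, since least_squares does more input validation and builds a full OptimizeResult:

```python
import timeit

import numpy as np
from scipy import optimize

# Noise-free toy data: timing here measures solver overhead, not fit quality.
x = np.linspace(0, 1, 100)
y = 2.5 * np.exp(1.3 * x)

def residuals(p):
    return p[0] * np.exp(p[1] * x) - y

p0 = [1.0, 1.0]

t_old = timeit.timeit(lambda: optimize.leastsq(residuals, p0), number=200)
t_new = timeit.timeit(lambda: optimize.least_squares(residuals, p0, method='lm'),
                      number=200)
print(f"leastsq: {t_old:.3f}s  least_squares: {t_new:.3f}s")
```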