
I have a set of x,y points in which x lies between 0 and 100e-12 while y ranges between 0 and 1e9, and I want to use scipy.curve_fit to fit a model. It fails because of the magnitude of the numbers. If I rescale the data so that the values are closer to 1 (i.e. x *= 1e10 and y *= 1e-9), then it works. So I know where the problem is and could eventually live with this solution, but I would prefer to perform the fit in the original scale.
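To make the workaround concrete, this is roughly what I mean by rescaling; the linear model and the synthetic data below are only placeholders with the same orders of magnitude as my actual setup:

import numpy as np
from scipy.optimize import curve_fit

# placeholder model; the slope ends up of order 1e19 for data like mine
def model(x, a, b):
    return a * x + b

x = np.linspace(1e-12, 100e-12, 50)
y = model(x, 2e19, 1e8)

# fit in rescaled units so both axes are of order 1
popt_s, pcov_s = curve_fit(model, x * 1e10, y * 1e-9, p0=[1.0, 0.5])

# undo the scaling on the fitted parameters of this particular model:
# y*1e-9 = (a*1e-19)*(x*1e10) + b*1e-9, so a = a_s*1e19 and b = b_s*1e9
a, b = popt_s[0] * 1e19, popt_s[1] * 1e9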

Is it possible?

I have seen an answer that suggests using a diag argument, but with this I get: least_squares() got an unexpected keyword argument 'diag'. I guess that answer was for an older version. Is there an analogue in the current version?

Additional info: I am providing curve_fit with very reasonable p0.

user171780

2 Answers


It IS possible, but in ML it is preferable to rescale when features have very different ranges. It also helps to improve overall performance and accuracy.

If you don't like other methods, you can try log normalization. It usually helps to solve such problems.

For example, here is how you could do it:

import numpy as np
x_norm = np.log(x)
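For instance, here is a minimal sketch assuming the data roughly follow a power law; the data and model below are purely illustrative, and x must be strictly positive before taking the log:

import numpy as np
from scipy.optimize import curve_fit

# illustrative power-law data with magnitudes similar to the question
x = np.linspace(1e-12, 100e-12, 50)
y = 1e19 * x**1.0

# log-transform both axes so the fitter only sees numbers of moderate size
x_norm = np.log(x)
y_norm = np.log(y)

# the power law y = C * x**p becomes a straight line in log-log space
def log_model(log_x, p, log_C):
    return p * log_x + log_C

popt, pcov = curve_fit(log_model, x_norm, y_norm, p0=[1.0, 40.0])
p, C = popt[0], np.exp(popt[1])   # back-transform the intercept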

curve_fit is a wrapper of a wrapper: it calls either least_squares or leastsq, depending on the method chosen with the method keyword argument. The default is the latter (method='lm'), which accepts the diag argument. If bounds are set, the former is chosen by default (method='trf'), and least_squares does not accept diag as a kwarg.

Personally, I would always choose to scale.
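For reference, both kinds of scale argument can be passed straight through curve_fit, which forwards extra keyword arguments to the underlying solver. The sketch below uses a placeholder linear model and guessed parameter magnitudes; note that least_squares has no diag, but its x_scale plays a similar role (with a different convention: x_scale is the characteristic size of each parameter, while diag acts as a multiplicative scale factor on it):

import numpy as np
from scipy.optimize import curve_fit

# placeholder model; with x ~ 1e-10 and y ~ 1e9 the slope is of order 1e19
def model(x, a, b):
    return a * x + b

x = np.linspace(1e-12, 100e-12, 50)
y = model(x, 2e19, 1e8)
p0 = [1e19, 1e8]

# method='lm' calls leastsq, which accepts diag (here: reciprocals of the
# expected parameter magnitudes, bringing the scaled variables to order 1)
popt_lm, _ = curve_fit(model, x, y, p0=p0, method='lm', diag=[1e-19, 1e-8])

# with bounds (or method='trf') least_squares is used instead; its x_scale
# takes the expected parameter magnitudes themselves
popt_trf, _ = curve_fit(model, x, y, p0=p0, method='trf', x_scale=[1e19, 1e8])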

mikuszefski