
I have implemented an algorithm that fits multiple data sets at the same time. It is based on this solution: multi fit

The target function is too complex to show here (LaFortune scatter model), so I will use the target function from the solution for explanation:

def lor_func(x,c,par):
    a,b,d=par
    return a/((x-c)**2+b**2)

How can I penalize the fitting algorithm if it chooses a parameter set par that results in lor_func < 0?

A negative value of the target function is valid from a mathematical point of view, so the parameter set par that produces it might be the solution with the least error. But I want to exclude such solutions, as they are not physically valid.

A function like:

def lor_func(x,c,par):
    a,b,d=par
    value = a/((x-c)**2+b**2)
    return max(0, value)

does not work: the fit then optimizes against the clamped zero values as well, so the result differs from the correct one.
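One way to "punish" negative model values without clamping the output is to append a penalty term to the residual vector passed to `scipy.optimize.least_squares`. The sketch below is illustrative, not from the original post: the `weight` value and the synthetic data are assumptions, and the unused parameter `d` is kept only to match the question's signature.

```python
import numpy as np
from scipy.optimize import least_squares

def lor_func(x, c, par):
    a, b, d = par          # d is unused, kept to match the question
    return a / ((x - c)**2 + b**2)

def residuals(par, x, y, c, weight=1e3):
    model = lor_func(x, c, par)
    # Extra residual entries that are zero where model >= 0 and grow
    # linearly where model < 0, so the optimizer is steered away from
    # negative-valued solutions instead of seeing a flat max(0, ...).
    penalty = weight * np.minimum(model, 0.0)
    return np.concatenate([model - y, penalty])

# Synthetic data for illustration
x = np.linspace(-5, 5, 100)
y = lor_func(x, 0.0, (2.0, 1.0, 0.0))

res = least_squares(residuals, x0=[1.0, 0.5, 0.0], args=(x, y, 0.0))
```

Because the true model here is everywhere positive, the penalty entries stay at zero near the optimum and the fit is undistorted; they only act when the optimizer wanders into negative-valued parameter sets.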

RaJa

1 Answer

Use the `bounds` argument of `scipy.optimize.least_squares`?

res = least_squares(func, x_guess, args=(Gd, K),
                    bounds=([0.0, -100, 0, 0],
                            [1.0, 0.0, 10, 1]),
                    max_nfev=100000, verbose=1)

like I did here: Suggestions for fitting noisy exponentials with scipy curve_fit?
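Applied to the lor_func example from the question, a minimal self-contained sketch could look like this. The synthetic data, starting guess, and the particular bound values (a >= 0, b >= 0 to keep the peak amplitude non-negative) are illustrative assumptions, not from the original post:

```python
import numpy as np
from scipy.optimize import least_squares

def lor_func(x, c, par):
    a, b, d = par          # d is unused, kept to match the question
    return a / ((x - c)**2 + b**2)

def residuals(par, x, y, c):
    return lor_func(x, c, par) - y

# Synthetic data for illustration
x = np.linspace(-5, 5, 200)
y = lor_func(x, 0.0, (2.0, 1.0, 0.0))

# Constrain a and b to be non-negative so lor_func cannot go negative;
# d is left unbounded. Initial guess must lie inside the bounds.
res = least_squares(residuals, x0=[1.0, 0.5, 0.1],
                    bounds=([0.0, 0.0, -np.inf],
                            [np.inf, np.inf, np.inf]),
                    args=(x, y, 0.0))
```

With bounds present, least_squares uses its trust-region-reflective method, so the search never leaves the feasible box.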

f5r5e5d
  • Bounds are not an option as I do not know them beforehand. It is a problem of the target function, which has a term `power(a, b)`. Sometimes `a<0`, which results in `nan` if `b` is not an integer. I have solved this by returning a negative number when `nan` occurs. So I have to prevent the optimizer from finding solutions where the target function becomes negative. – RaJa Mar 03 '17 at 06:46
  • It seems that this approach is the only one that works. It requires some fiddling with the bounds, but it works. – RaJa Mar 06 '17 at 09:27