I have a question similar to python nonlinear least squares fitting, except that I want to optimize a vector AND some free parameters.

I am used to the scipy.optimize.curve_fit wrapper function, but I do not have a functional form for the vector I'm optimizing. I need to minimize the sum of the squares of this residual (in LaTeX notation, sorry it's not super readable):

\sum_n \left[ f_n - \sum_N \left( a_N + b_N \, n + \mathrm{DATA}(n, N) \right) \right]^2

Right now, my code just optimizes over f_n:

import numpy as np
import scipy as sp
import scipy.optimize

def residuals(fn, aN, bN, DATA):
    # leastsq squares and sums this residual vector internally
    return fn - function(aN, bN, DATA)

def function(aN, bN, DATA):
    # accumulate the per-N linear term plus the corresponding data row
    stuff = np.zeros(max_n)
    n = np.arange(max_n)
    for N in range(len(DATA)):
        stuff += aN[N] + bN[N] * n + DATA[N]
    return stuff

bestfit_f_n = sp.optimize.leastsq(residuals, INITIAL_GUESS_FOR_f_n,
                                  args=(aN, bN, DATA), full_output=1)

but I want a_N and b_N to be free parameters that are optimized over and returned as well.
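
To make this concrete, here is a minimal sketch of the direction I have in mind: flatten f_n, a_N, and b_N into a single parameter vector and unpack it inside the residual function. The sizes, random DATA, and initial guesses below are placeholders, not my real data. Note that with everything free there are more unknowns (max_n + 2*num_N) than residuals (max_n); as far as I understand, leastsq's Levenberg-Marquardt refuses such underdetermined problems, while scipy.optimize.least_squares with its default 'trf' method accepts them.

import numpy as np
from scipy.optimize import least_squares

max_n = 100                          # placeholder length of f_n
num_N = 5                            # placeholder number of (a_N, b_N) pairs
DATA = np.random.rand(num_N, max_n)  # placeholder data, one row per N

def model(aN, bN, DATA):
    # same model as `function` above
    n = np.arange(max_n)
    stuff = np.zeros(max_n)
    for N in range(len(DATA)):
        stuff += aN[N] + bN[N] * n + DATA[N]
    return stuff

def residuals_packed(params, DATA):
    # unpack the flat parameter vector into f_n, a_N, b_N
    fn = params[:max_n]
    aN = params[max_n:max_n + num_N]
    bN = params[max_n + num_N:]
    return fn - model(aN, bN, DATA)

# one flat vector of initial guesses: [f_n, a_N, b_N]
p0 = np.concatenate([np.zeros(max_n), np.ones(num_N), np.ones(num_N)])

result = least_squares(residuals_packed, p0, args=(DATA,))

f_n_best = result.x[:max_n]
aN_best = result.x[max_n:max_n + num_N]
bN_best = result.x[max_n + num_N:]

Is this packing approach the standard way to do it, or is there something cleaner?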

If there's a better function to use, I'd be glad to hear about it. Right now this could be optimized linearly, but I might be working with very large data sets in the future and would prefer a non-linear optimization.
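
For completeness, this is the linear formulation I mean: the model is linear in all of f_n, a_N, and b_N, so each n gives one equation f_n - \sum_N a_N - n \sum_N b_N = \sum_N DATA(n,N), and the stacked system can be handed to np.linalg.lstsq (which returns the minimum-norm solution when the system is underdetermined). A sketch with the same placeholder sizes as above:

import numpy as np

max_n, num_N = 100, 5                # same placeholder sizes as above
DATA = np.random.rand(num_N, max_n)  # placeholder data

# unknown vector x = [f_n, a_N, b_N]; one equation per n
A = np.zeros((max_n, max_n + 2 * num_N))
A[:, :max_n] = np.eye(max_n)                        # coefficients of f_n
A[:, max_n:max_n + num_N] = -1.0                    # coefficients of a_N
A[:, max_n + num_N:] = -np.arange(max_n)[:, None]   # coefficients of b_N
rhs = DATA.sum(axis=0)                              # \sum_N DATA(n, N)

x, _, rank, _ = np.linalg.lstsq(A, rhs, rcond=None)
f_n_lin = x[:max_n]
aN_lin = x[max_n:max_n + num_N]
bN_lin = x[max_n + num_N:]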

  • 1. There's a bug in the function, `stuff` gets overwritten `len(DATA)` times. 2. You probably want to optimize the sum of abs values or sum of squares or some such. – ev-br Jun 05 '18 at 20:41
  • 1. Thank you, you're right. 2. The scipy optimize leastsq function takes the residual as the input and minimizes the sum of the squares itself. – Mich Kelley Jun 05 '18 at 21:33
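
To illustrate the point in the second comment: leastsq squares and sums whatever vector the residual function returns, so the residual itself is passed in unsquared. A tiny self-contained check (fitting a scalar mean, which is the least-squares minimizer):

import numpy as np
from scipy.optimize import leastsq

data = np.array([1.0, 2.0, 3.0])
# leastsq minimizes sum((data - m)**2) over m
best, ier = leastsq(lambda m: data - m, x0=0.0)
# best is approximately [data.mean()] == [2.0]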

0 Answers