Making my question a bit clearer:

I have determined a minimised chi-squared value, computed by a function chisq(), for some data given a fit involving 5 parameters: a, b, c, d and e.

I would like to know of a function, or a way, to minimise my chi-squared function while keeping 1 parameter constant and varying the others, so that it tends to a minimum value... AND

A way in which I can keep 4 of the 5 parameters constant while driving the chi-squared towards a specified value (the value will be the minimum chi-squared + 1, so as to reach the edge of a chi-squared contour, but that's just some added context).
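To make that concrete, here is a minimal sketch of the two steps as I understand them, assuming a callable chisq(a, b, c, d, e); the toy quadratic and the starting guesses below are just stand-ins for my real function. An inner minimisation varies the free parameters while one is held fixed, and a scalar root-find then locates where that "profiled" chi-squared crosses the minimum + 1:

import numpy as np
from scipy.optimize import minimize, brentq

# Stand-in for the real chi-squared of the 5-parameter fit.
def chisq(a, b, c, d, e):
    return (a - 1)**2 + (b - 2)**2 + (c + 3)**2 + (d - 4)**2 + (e - 5)**2

# Part 1: minimise over b, c, d, e while 'a' is held fixed.
def profile_chisq(a_fixed):
    guess = np.array([2.0, -3.0, 4.0, 5.0])      # placeholder start for b, c, d, e
    res = minimize(lambda v: chisq(a_fixed, *v), guess, method='Nelder-Mead')
    return res.fun

# Full minimum over all five parameters.
best = minimize(lambda v: chisq(*v), np.ones(5), method='Nelder-Mead')
chi_min, a_best = best.fun, best.x[0]

# Part 2: the 'a' at which the profiled chi-squared reaches chi_min + 1,
# i.e. the upper edge of the 1-sigma contour in 'a'.
a_edge = brentq(lambda a: profile_chisq(a) - (chi_min + 1.0), a_best, a_best + 10.0)
print(a_edge - a_best)    # upper 1-sigma uncertainty on 'a'

(Running brentq again on the interval (a_best - 10, a_best) would give the lower edge.)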

That's the shape of what I'm after, but I'm stuck on how to wire it into my real fit... Anyway, I hope that's more coherent than my last attempt.

Update: This is where I'm at now. I borrowed the main code from another page and I'm trying to adapt it to my needs. I never intended to do confidence-region plots, but they look very nice.

However, I'm getting the error message "'numpy.ndarray' object is not callable". I've looked into this and I think it's because I'm referencing an array as though it were a function. This points to the minimiser / mini section, which I think is where the problem lies.

So any help on how I can correct and adapt this code would be much appreciated. Thank you so far, @mikuszefski.
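As a sanity check on my reading of the error, this is the bare pattern that produces it (a toy reproduction, not my actual code): calling an array as if it were a function.

import numpy as np

arr = np.arange(3)
arr(2)    # TypeError: 'numpy.ndarray' object is not callable

The full script follows: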

import numpy as np
import matplotlib.pyplot as plt
import lmfit


# Data

xval1 = np.array([0.11914471, 0.230193321, 0.346926427, 0.460501481, 0.575512013, 0.689201904, 0.804958885, 0.918017166, 1.034635434, 1.149990481, 1.266264234]) 
xval2 = np.array([0.116503429, 0.230078483, 0.348247067, 0.461420187, 0.577751359, 0.690120611, 0.80398276, 0.918878453, 1.033716728, 1.150794349, 1.266149395]) 
xval3 = np.array([0.115240208, 0.230710093, 0.346007721, 0.458032458, 0.576028785, 0.692359957, 0.80725565, 0.920888123, 1.035037368, 1.147521458, 1.267871969])
xval_av = (xval1 + xval2 + xval3 )/3

yval1 = np.array([2173.446136, 2400.969972, 2547.130603, 2658.022565, 2723.098769, 2760.481961, 2761.881166, 2734.638209, 2671.839502, 2559.251885, 2313.774753])
yval2 = np.array([2185.207051, 2441.547714, 2587.120886, 2701.697704, 2765.897692, 2792.190609, 2793.030024, 2761.191402, 2697.078931, 2583.572776, 2329.860358])
yval3 = np.array([2178.269249, 2438.81575, 2601.305985, 2704.228832, 2765.493933, 2802.801105, 2796.873214, 2760.426641, 2690.92674, 2575.961498, 2330.099396])
yval_av = (yval1 + yval2 + yval3 )/3

yerr1 = np.array([28.0041333, 17.22884875, 11.77504404, 12.0784658, 11.43116392, 10.17734322, 9.839420192, 7.879393332, 8.586145049, 8.129879332, 6.543378761])
yerr2 = np.array([28.84371257, 22.30391049, 16.3503791, 12.32503283, 10.6931908, 8.909895306, 9.310031644, 9.619524491, 7.725528299, 6.64584248, 5.483917187])
yerr3 = np.array([25.22103486, 17.58553765, 16.56186149, 14.93545788, 9.884470694, 11.15591573, 8.696907113, 9.602409632, 8.156617355, 6.966615987, 5.706426531])
yerr_av = (yerr1 + yerr2 + yerr3 )/3

# Parameters

a_original = 1772.73301389 
b_original = 4137.16888269 
c_original = -6903.58620042 
d_original = 5845.15206084 

#-----------------------------------------------------------------------------------------

x = xval_av
y = yval_av

p = lmfit.Parameters()
p.add_many(('a', a_original), ('b', b_original), ('c', c_original), ('d', d_original))

def residual(p, x, y):
    # Cubic model minus data; lmfit passes the Parameters object first and
    # forwards x, y from fcn_args. (Dividing by yerr_av here would turn the
    # summed squares into a proper chi-squared.)
    return p['a'] + p['b']*x + p['c']*x**2 + p['d']*x**3 - y

# Minimiser: pass the residual function itself, not residual(x, y, p).
# Calling it hands Minimizer an ndarray, which is what raises
# "'numpy.ndarray' object is not callable" when lmfit tries to call it.
mini = lmfit.Minimizer(residual, p, fcn_args=(x, y), nan_policy='omit')

# Nelder-Mead
out1 = mini.minimize(method='Nelder')

# Levenberg-Marquardt, using the Nelder-Mead result as the initial guess
out2 = mini.minimize(method='leastsq', params=out1.params)

lmfit.report_fit(out2.params, min_correl=0.5)

ci, trace = lmfit.conf_interval(mini, out2, sigmas=[1, 2],
                                trace=True, verbose=False)
lmfit.printfuncs.report_ci(ci)

plot_type = 2

if plot_type == 0:
    plt.plot(x, y)
    plt.plot(x, residual(out2.params, x, y) + y)

elif plot_type == 1:
    cx, cy, grid = lmfit.conf_interval2d(mini, out2, 'b', 'd' )
    plt.contourf(cx, cy, grid, np.linspace(0, 1, 11))
    plt.xlabel('b')
    plt.colorbar()
    plt.ylabel('d')

elif plot_type == 2:
    cx, cy, grid = lmfit.conf_interval2d(mini, out2, 'a', 'd', 30, 30)
    plt.contourf(cx, cy, grid, np.linspace(0, 1, 11))
    plt.xlabel('a')
    plt.colorbar()
    plt.ylabel('d')

elif plot_type == 3:
    cx1, cy1, prob = trace['a']['a'], trace['a']['d'], trace['a']['prob']
    cx2, cy2, prob2 = trace['d']['d'], trace['d']['a'], trace['d']['prob']
    plt.scatter(cx1, cy1, c=prob, s=30)
    plt.scatter(cx2, cy2, c=prob2, s=30)
    plt.gca().set_xlim((2.5, 3.5))
    plt.gca().set_ylim((11, 13))
    plt.xlabel('a')
    plt.ylabel('d')

if plot_type > 0:
    plt.show()
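Following @mikuszefski's pointer about fixing parameters, vary=False appears to be how lmfit holds a parameter constant; here is a small sketch reusing residual, x, y and out2 from the script above (my reading of the docs, not yet tested against this data):

# Hold 'a' fixed at its fitted value and re-minimise the remaining parameters.
p_fixed = out2.params.copy()
p_fixed['a'].set(vary=False)
mini_fixed = lmfit.Minimizer(residual, p_fixed, fcn_args=(x, y), nan_policy='omit')
out_fixed = mini_fixed.minimize(method='leastsq')
lmfit.report_fit(out_fixed.params)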
  • Please post the error messages you are getting. – berkelem Nov 19 '18 at 15:16
  • This is the thing, it's not the errors that I'm concerned about; I can troubleshoot where I've mucked up array indexing and such on my own. I just don't know how I can, say, minimise my chi-squared function to a value of (chi_min + 1) by varying only one of the parameters/arguments (this will be the one I'm trying to find the error on), then minimise chi completely by varying all the other parameters whilst keeping the previous one constant. So an iterative process where I can find the edge of a Chi + 1 contour and relate this back to my original chi, to find the error on that parameter... – CricK0es Nov 20 '18 at 11:46
  • So I had a look at minimise_scalar (lambda) or optimise_scalar (lambda,), something like that but it seems that I could only vary one of my parameters. I really just have no idea what functions to use or how I could set it up or if there's an easier way using bootstrap or curve optimisation functions... I'm just completely stuck on how I'd get started on such a function – CricK0es Nov 20 '18 at 11:50
  • Does this question help? https://stackoverflow.com/questions/13670333/multiple-variables-in-scipys-optimize-minimize – berkelem Nov 20 '18 at 12:25
  • Can you rephrase your post such that it is clear what you actually want? What do you mean by `chi_min + 1`? Is it clear that if only one parameter, e.g., `a`, is optimized you can actually get down to that value? – mikuszefski Nov 22 '18 at 13:16
  • I think the `lmfit` package allows you to fix parameters. – mikuszefski Nov 23 '18 at 09:53
  • I've installed the lmfit package and had a play and looked through some examples and I've got something like this. It's not my original plan but it seems more eloquent this way. I've updated my post – CricK0es Nov 23 '18 at 15:55
  • The uncertainties estimated with `lmfit.minimize()` using the `leastsq` method will do what you seem to want -- find uncertainties as values for parameter A for which the best *other* parameters increase chi-square by 1. These (and the associated correlations) are automatically determined as part of the fit process (and so, very fast). The confidence intervals available from `lmfit.conf_interval()` are calculated more explicitly (by brute force). These uncertainties can be more accurate and can be non-symmetric, but the automatic values are surprisingly good for many real problems. – M Newville Nov 24 '18 at 01:52
  • That's great. I've got it all working now and I've got some lovely graphs and results from it. Thank you everyone for your help ;) – CricK0es Nov 24 '18 at 18:42
