
I'm using a least-squares minimization to fit a lot of parameters, but the results are somewhat surprising, and I think the minimization itself could be the cause. Indeed, when I modify the initial values, the results are different...

At first I tried to tune the fit with the `xtol` option, which specifies the relative error desired in the approximate solution, but the results were no better. So I would like to know whether it is possible to specify bounds for each parameter. Wouldn't that be more accurate?

Moreover, the terms I need to fit are the coefficients of a series expansion; perhaps there is a better way to handle this problem than writing out the two orders by hand, as I have done:

def residual_V2will2(vars, XA, YA, x0, y0, A2, donnees):
    # first two orders of the series for the vertical displacement field
    aI, aII = vars
    r = np.sqrt((XA - x0)**2 + (YA - y0)**2)
    theta = np.arctan((YA - y0) / (XA - x0))
    modeleV2 = ma.masked_invalid(
        np.sqrt(r) * aI / (2*G) * ((Kappa - 1/2. + 1) * np.sin(theta / 2)
                                   + 1/2. * np.sin(-3/2. * theta))
        + r * aII / (2*G) * ((Kappa - 2) * np.sin(theta)
                             + np.sin(-theta))
        + A2)
    return donnees - modeleV2

import scipy.optimize as sco

vars = [KI, A2, x0, y0]
out_V_west = sco.leastsq(residual_V2west, vars, args=(XAvect, YAvect, Vmvect))
print(out_V_west)
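For reference, newer SciPy versions provide `scipy.optimize.least_squares`, which accepts per-parameter bounds directly instead of `leastsq`. A minimal sketch, using an assumed toy model `a*sqrt(x) + b` and synthetic data rather than the actual Williams series:

```python
import numpy as np
from scipy.optimize import least_squares

def residual(params, x, data):
    # toy model standing in for modeleV2: a*sqrt(x) + b
    a, b = params
    return data - (a * np.sqrt(x) + b)

x = np.linspace(1.0, 10.0, 50)
data = 2.0 * np.sqrt(x) + 0.5          # synthetic "donnees"

# bounds=(lower_list, upper_list), one entry per parameter
res = least_squares(residual, x0=[1.0, 0.0],
                    bounds=([0.0, -1.0], [5.0, 1.0]),
                    args=(x, data))
print(res.x)
```

The same pattern would apply to `[KI, A2, x0, y0]`: supply a lower and an upper list of length four.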

So, if I follow you, I should have something like this:

def residual_V2west(vars, XA, YA, donnees):
    KI, A2, x0, y0 = vars
    modeleV2 = ...
    # large weight so the penalty dominates the residual when violated
    penalization = abs(2. - modeleV2) * 10000
    return donnees - modeleV2 - penalization

But it doesn't seem any better :( although I have tried playing with the penalization weight...
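One issue with subtracting the penalty from the data misfit is that it shifts every residual, even when no bound is violated. An alternative is to append the penalty as an extra component of the residual vector, so it is zero inside the allowed range and only grows outside it. A sketch with the same assumed toy model (not the actual Williams series), penalizing the first parameter when it leaves [0, 5]:

```python
import numpy as np
from scipy.optimize import leastsq

def residual(params, x, data):
    a, b = params
    model = a * np.sqrt(x) + b
    # penalty as an *extra* residual entry: zero when 0 <= a <= 5,
    # large when the parameter strays outside that range
    penalty = 1e4 * max(0.0, a - 5.0) + 1e4 * max(0.0, -a)
    return np.append(data - model, penalty)

x = np.linspace(1.0, 10.0, 50)
data = 2.0 * np.sqrt(x) + 0.5          # synthetic "donnees"
p_opt, ier = leastsq(residual, [1.0, 0.0], args=(x, data))
print(p_opt)
```

Because `leastsq` squares and sums all residual entries, the appended penalty only contributes to the cost when the bound is actually violated.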

  • 1
    [this answer tells](http://stackoverflow.com/a/16632712/832621) how to include penalizations in the residual function... you can do the same for your problem in case the values go outside the boundaries... – Saullo G. P. Castro Sep 28 '14 at 19:31
  • I tried to limit the number of parameters to fit; I can get an idea of x0 and y0, but even with only 2 parameters I have the same problem... perhaps my model isn't good... :/ – user3601754 Sep 29 '14 at 18:28
  • I believe the number of parameters is not the problem... perhaps the penalization should be included like `(donnees - modeleU2 + penalization)` – Saullo G. P. Castro Sep 29 '14 at 20:36
  • "Indeed, when i modify the initialization terms, the results are different...": this suggests that you end up in various local minima. With quite a few parameters, fitting can become complicated, and the fit can fairly easily get stranded inside a local minimum. I think the complicated function doesn't help. It may help if all your parameters (`var`, `XAvect`, `YAvect`, `Vmvect`) are reasonably normalized, i.e., order ~1. –  Sep 30 '14 at 13:53
  • Perhaps it could be that, but my function is an expansion series; perhaps I could avoid this problem since I'm in a special case? It's the Williams series, which describes displacements near a crack tip... – user3601754 Sep 30 '14 at 13:58
  • If it could be, you should investigate. What is the rough order of your input data? Could you give some numbers, or perhaps you have a figure with the data? –  Sep 30 '14 at 14:19
  • 1
    I've had good experiences with [lmfit](http://lmfit.github.io/lmfit-py/), which allows bounding variables and a great deal more. I would recommend not using penalty functions but rather [doing a transform](http://stats.stackexchange.com/questions/1112/how-to-represent-an-unbounded-variable-as-number-between-0-and-1) for enforcing bounds which result in non-physical values of your function – chthonicdaemon Sep 30 '14 at 14:33
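The transform mentioned in the last comment can be combined with plain `leastsq`: optimize an unbounded variable `u` and map it into the allowed interval with a logistic function, so the fitted parameter can never leave its bounds. A sketch under the same assumed toy model `a*sqrt(x) + b` with synthetic data:

```python
import numpy as np
from scipy.optimize import leastsq

def to_bounded(u, lo, hi):
    # logistic map: any real u -> value strictly inside (lo, hi)
    return lo + (hi - lo) / (1.0 + np.exp(-u))

def residual(u, x, data):
    a = to_bounded(u[0], 0.0, 5.0)    # a constrained to (0, 5)
    b = to_bounded(u[1], -1.0, 1.0)   # b constrained to (-1, 1)
    return data - (a * np.sqrt(x) + b)

x = np.linspace(1.0, 10.0, 50)
data = 2.0 * np.sqrt(x) + 0.5          # synthetic "donnees"
u_opt, ier = leastsq(residual, [0.0, 0.0], args=(x, data))
a_fit = to_bounded(u_opt[0], 0.0, 5.0)
b_fit = to_bounded(u_opt[1], -1.0, 1.0)
print(a_fit, b_fit)
```

Unlike a penalty term, this keeps the residual smooth everywhere, which tends to behave better with the Levenberg-Marquardt algorithm that `leastsq` uses.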

0 Answers