I wish to use scipy's optimize.fmin function to find the minimum of a function that depends both on the variables I want to minimize over and on parameters that do not change (are not optimized over).
I am able to do this when optimizing over a single variable:
from scipy import optimize
c1=4
c2=-1
def f(x,c1,c2):
    return x**2+c1+c2
guess_f=1
minimum = optimize.fmin(f,guess_f,args=(c1,c2),maxfun=400,maxiter=400,ftol=1e-2,xtol=1e-4)
However, I cannot get this to work when I add another variable to minimize over:
def g(x,y,c1,c2):
    return x*y+c1+c2
guess_g=[1,1]
minimum2= optimize.fmin(g,guess_g,args=(c1,c2),maxfun=400,maxiter=400,ftol=1e-2,xtol=1e-4)
I get the following error message:
TypeError: g() missing 1 required positional argument: 'c2'
I did find Multiple variables in SciPy's optimize.minimize, where the solution presented is to group the variables to be optimized over into a single array. I tried something like that below:
def g(params,c1,c2):
    x,y=params
    # print(params)
    return x*y+c1*x+c2
guess_g=[1,1]
minimum2= optimize.fmin(g,guess_g,args=(c1,c2),maxfun=4000,maxiter=4000,ftol=1e-2,xtol=1e-4)
I do not receive a TypeError, but I do get the "Warning: Maximum number of function evaluations has been exceeded." message, along with a "RuntimeWarning: overflow encountered in double_scalars". (I also tried to do the same thing with optimize.minimize, but I was unable to get it to work when adding the extra arguments; I do not post that code here since the question is already getting long.)
So this does not seem to be the correct way to do it.
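For reference, the optimize.minimize attempt I mention above had roughly the shape below. This is only a sketch of the pattern, not my exact code; the choice of the Nelder-Mead method and of the maxfev/xatol/fatol options is just an assumption meant to mirror the fmin call:

from scipy import optimize

c1=4
c2=-1

def g(params,c1,c2):
    x,y=params
    return x*y+c1*x+c2

guess_g=[1,1]
# same idea as with fmin: the variables being optimized are packed into the
# first argument, and the fixed parameters are passed through args
result = optimize.minimize(g,guess_g,args=(c1,c2),method='Nelder-Mead',
                           options={'maxfev':400,'maxiter':400,'fatol':1e-2,'xatol':1e-4})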
How do I go about optimizing over multiple variables with the optimize.fmin function, while also giving my function additional arguments?