
I am looking for the correct approach to use a variable number of parameters as input for the optimizer in scipy.

I have a set of input parameters p1,...,pn and I calculate a quality criterion with a function func(p1,...,pn). I want to minimize this value.

The input parameters are either 0 or 1, indicating whether they should be used or not. I cannot simply delete all unused ones from the parameter list, since my function for the quality criterion requires them to be "0" to remove unused terms from the equations.

def func(parameters):
    ...calculate one scalar as quality criteria...

solution = optimize.fmin_l_bfgs_b(func, parameters, approx_grad=1,
                                  bounds=((0.0, 5.0), ..., (0.0, 5.0)))  # this varies all parameters

Within my code the optimizer runs without errors, but of course all given parameters are changed to achieve the best solution.

Is there a way to have e.g. 10 input parameters for func, but only 5 of them are used in the optimizer?

So far I can only think of changing my func definition so that I will not need the "0" input from unused parameters. I would appreciate any ideas on how to avoid that.

Thanks a lot for the help!

sheepie
    I am not sure to understand what you're trying to do, but have a look at the [**kwargs](http://stackoverflow.com/questions/1769403/understanding-kwargs-in-python) – Paco Sep 11 '13 at 15:52
  • Thanks for pointing it out, it did not quite solve the problem, but was good to know anyway. – sheepie Sep 13 '13 at 12:31

2 Answers


If I understand correctly, you are asking for a constrained best fit: rather than finding the best [p0,p1,p2...p10] for function func(), you want to find the best [p0, p1, ...p5] for function func() under the condition that p6=fixed6, p7=fixed7, p8=fixed8... and so on.

Translating this into Python code is straightforward if you use args=(something,) in scipy.optimize.fmin_l_bfgs_b. First, write a partially fixed function func_fixed():

def func_fixed(p_var, p_fixed):
    # this only works if both are lists; for numpy arrays use hstack, append or similar
    return func(p_var + p_fixed)

solution = optimize.fmin_l_bfgs_b(func_fixed, x0=guess_parameters,
                                  approx_grad=your_grad,
                                  bounds=your_bounds,
                                  args=(your_fixed_parameters,),  # this is the deal
                                  other_things)

It is not necessary to have func_fixed(); you can use a lambda instead. But it reads much more easily this way.
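To make the idea concrete, here is a minimal runnable sketch of this approach. The objective func is a hypothetical stand-in (a simple quadratic, not the asker's real function), and the split into three free and two fixed parameters is just for illustration:

```python
import numpy as np
from scipy import optimize

# Toy objective over 5 parameters, minimised at [0, 1, 2, 3, 4].
def func(p):
    return sum((x - i) ** 2 for i, x in enumerate(p))

def func_fixed(p_var, p_fixed):
    # Concatenate the free and fixed parts back into the full parameter list.
    return func(list(p_var) + list(p_fixed))

p_fixed = [3.0, 4.0]        # last two parameters held constant
guess = [2.0, 2.0, 2.0]     # initial values for the three free parameters
bounds = [(0.0, 5.0)] * 3   # bounds only for the free parameters

x_opt, f_opt, info = optimize.fmin_l_bfgs_b(
    func_fixed, x0=guess, args=(p_fixed,), approx_grad=True, bounds=bounds)
```

The optimizer only ever sees the three free parameters; x_opt converges toward [0, 1, 2] while the fixed values never change.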

CT Zhu

I recently solved a similar problem where I want to optimise a different subset of parameters at each run but need all parameters to calculate the objective function. I added two arguments to my objective function:

  1. an index array x_idx which indicates which parameters to optimise, i.e. 0 means don't optimise and 1 means optimise
  2. an array x0 with the initial values of all parameters

In the objective function I build the full parameter list according to the index array: positions flagged for optimisation take the optimised values, and the rest keep their initial values.

import numpy 
import scipy.optimize

def objective_function(x_optimised, x_idx, x0):
    x = []
    j = 0
    for i, idx in enumerate(x_idx):
        if idx == 1:  # '==' rather than 'is': identity comparison of ints is unreliable
            x.append(x_optimised[j])
            j = j + 1
        else:
            x.append(x0[i])
    x = numpy.array(x)
    return sum(x**2)

if __name__ == '__main__':
    x_idx = [1, 1, 0]
    x0 = [1.1, 1.3, 1.5]

    x_initial = [x for i, x in enumerate(x0) if x_idx[i] == 1]
    xopt, fopt, iter, funcalls, warnflag = scipy.optimize.fmin(objective_function, \
                                                   x_initial, args=(x_idx, x0,), \
                                                   maxfun = 200, full_output=True)
    print(xopt)
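The manual index bookkeeping above can also be done with a numpy boolean mask: start from a copy of the full initial vector and overwrite only the free slots. A sketch using the same toy objective and scipy.optimize.fmin:

```python
import numpy as np
import scipy.optimize

def objective_function(x_optimised, x_idx, x0):
    # Start from the full initial vector, then overwrite the free positions.
    x = np.array(x0, dtype=float)
    x[np.asarray(x_idx) == 1] = x_optimised
    return np.sum(x ** 2)

x_idx = [1, 1, 0]
x0 = [1.1, 1.3, 1.5]
x_initial = [x for i, x in enumerate(x0) if x_idx[i] == 1]

xopt = scipy.optimize.fmin(objective_function, x_initial,
                           args=(x_idx, x0), disp=False)
```

The two free parameters are driven toward 0 while x0[2] stays fixed at 1.5; the masked assignment replaces the explicit loop without changing the result.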
Daniel