I'm trying to optimize a function obtained with machine learning, based on polynomial regression, so I have no analytical expression for it. The function has 17 input parameters/independent variables (they are geometric parameters), and each parameter is bounded by specific values. The lists with these bounds are shown below:

minimum_values = [4.03, 15.03, 15.06, 20.02, 90.03, 75.0, 12.01, 12.03, 23.04, 24.01, 21.0, 35.09, 24.01, 21.08, 18.03, 30.04, 66.01]

maximum_values = [11.98, 21.99, 22.99, 29.99, 99.99, 83.0, 21.98, 21.96, 33.0, 29.98, 26.98, 42.94, 30.0, 26.99, 25.92, 42.76, 81.95]

All the articles and guides show only the simplest analytical functions with one or two independent variables. I tried some methods, but they only accept a scalar value, whereas I have to use a vector. I see that I should use "constraints" or "bounds" from scipy.optimize, but I don't understand what to write in the constraint. So far, I've just written:

x_initial = [10, 20, 20, 25, 95, 80, 20, 20, 30, 27, 25, 40, 25, 25, 20, 35, 70]

max_efficient = scipy.optimize.fmin(lambda x: -polynom_regression(data, x), x_initial, callback=cb, retall=True)

And it works, but of course the optimization never finishes properly, because the algorithm is searching for an extremum in unbounded space.

I would be grateful for any help!

  • Polynomial regression itself is a minimization task, so are you stacking optimization tasks here? Check out the package `lm_fit`; it allows setting boundaries. Otherwise you could use the usual barrier or penalty embeddings to force the parameters closer to the admissible set. – Lutz Lehmann Dec 28 '22 at 11:23
  • See https://stackoverflow.com/questions/16760788/python-curve-fit-library-that-allows-me-to-assign-bounds-to-parameters for some examples and solution variants. – Lutz Lehmann Dec 28 '22 at 11:33
  • @LutzLehmann I don't quite know what you mean. A polynomial regression is some approximating function based on known values, and I don't agree that it is a minimization task. I have trained a model with a polynomial based on some database, and I just want to find an extremum of this function during optimization. For example, if I build a polynomial regression of degree 2 based on x = [-2, -1, -0.5, 0.5, 1, 2] and y = [4, 1, 0.25, 0.25, 1, 2], then I can find an extremum at x=0 with some accuracy during optimization. – Iaroslav_Che Dec 28 '22 at 11:52
  • It is still not clear what you want. You have a function model depending on 17 parameters. For each parameter you have an admissible interval. For each admissible tuple of parameters you get a function that, for some structural reason, has a maximum or other extremum. Now you want something with this maximum. Compute it for some fixed tuple of parameters, find its range over all admissible parameter tuples, find some optimal parameter tuple where the optimality property is connected to the maximum,...? – Lutz Lehmann Dec 28 '22 at 12:05
  • @LutzLehmann I can describe the task in more detail. In general, you got it right. The 17 parameters are geometric characteristics, and the target function (which I obtained during training based on a polynomial) is the efficiency of the power machine. Accordingly, these 17 parameters can be combined in different ways, but each of them has a range of variation. All these different combinations of parameters form a certain set, in which there is a combination that corresponds to the extremum (specifically the maximum) of efficiency (local and global). – Iaroslav_Che Dec 28 '22 at 12:29

1 Answer

I was able to solve this problem with only one method. This question helped me: Does scipy's minimize function with method "COBYLA" accept bounds? . The code is below:

minimum_values = [4.03, 15.03, 15.06, 20.02, 90.03, 75.0, 12.01, 12.03, 23.04, 24.01, 21.0, 35.09, 24.01, 21.08, 18.03, 30.04, 66.01]
maximum_values = [11.98, 21.99, 22.99, 29.99, 99.99, 83.0, 21.98, 21.96, 33.0, 29.98, 26.98, 42.94, 30.0, 26.99, 25.92, 42.76, 81.95]
x_initial = [10, 20, 20, 25, 95, 80, 20, 20, 30, 27, 25, 40, 25, 25, 20, 35, 70]

import scipy.optimize

# One (lower, upper) pair per variable
bounds = list(zip(minimum_values, maximum_values))

# In older SciPy versions COBYLA does not accept the `bounds` argument,
# so each box bound is expressed as an inequality constraint fun(x) >= 0.
# The default arguments lb/ub/i freeze the loop variables in each lambda.
cons = []
for i, (min_value, max_value) in enumerate(bounds):
    cons.append({'type': 'ineq',
                 'fun': lambda x, lb=min_value, i=i: x[i] - lb})
    cons.append({'type': 'ineq',
                 'fun': lambda x, ub=max_value, i=i: ub - x[i]})

res = scipy.optimize.minimize(lambda x: -polynom_regression(data, x),
                              x_initial, constraints=cons, method='COBYLA')
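As a sanity check of this constraint setup, here is a sketch that runs the same code on a stand-in objective: since the trained `polynom_regression` model isn't shown, `fake_efficiency` below is a hypothetical concave quadratic whose maximum is known to sit at the midpoint of each range.

```python
import numpy as np
from scipy.optimize import minimize

minimum_values = np.array([4.03, 15.03, 15.06, 20.02, 90.03, 75.0, 12.01, 12.03,
                           23.04, 24.01, 21.0, 35.09, 24.01, 21.08, 18.03, 30.04, 66.01])
maximum_values = np.array([11.98, 21.99, 22.99, 29.99, 99.99, 83.0, 21.98, 21.96,
                           33.0, 29.98, 26.98, 42.94, 30.0, 26.99, 25.92, 42.76, 81.95])
x_initial = np.array([10, 20, 20, 25, 95, 80, 20, 20, 30, 27, 25, 40, 25, 25, 20, 35, 70],
                     dtype=float)

# Hypothetical stand-in for the trained model: a concave quadratic whose
# maximum sits at the midpoint of each variable's admissible range.
midpoints = (minimum_values + maximum_values) / 2.0
def fake_efficiency(x):
    return -np.sum((x - midpoints) ** 2)

# Box bounds as COBYLA-style inequality constraints (fun(x) >= 0)
cons = []
for i, (lb, ub) in enumerate(zip(minimum_values, maximum_values)):
    cons.append({'type': 'ineq', 'fun': lambda x, lb=lb, i=i: x[i] - lb})
    cons.append({'type': 'ineq', 'fun': lambda x, ub=ub, i=i: ub - x[i]})

res = minimize(lambda x: -fake_efficiency(x), x_initial,
               constraints=cons, method='COBYLA')
# res.x should stay inside the bounds and land near the midpoints
```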

But so far I have only been able to solve this problem with the 'COBYLA' method; many other methods don't work with constraints.
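Note that several other `scipy.optimize.minimize` methods (e.g. 'L-BFGS-B', 'TNC', 'SLSQP') accept box bounds directly through the `bounds` argument, so for pure box constraints the inequality dictionaries aren't necessary. A minimal sketch using the same hypothetical stand-in objective (the trained model isn't shown, so a quadratic with a known maximum stands in for it):

```python
import numpy as np
from scipy.optimize import minimize

minimum_values = np.array([4.03, 15.03, 15.06, 20.02, 90.03, 75.0, 12.01, 12.03,
                           23.04, 24.01, 21.0, 35.09, 24.01, 21.08, 18.03, 30.04, 66.01])
maximum_values = np.array([11.98, 21.99, 22.99, 29.99, 99.99, 83.0, 21.98, 21.96,
                           33.0, 29.98, 26.98, 42.94, 30.0, 26.99, 25.92, 42.76, 81.95])
x_initial = np.array([10, 20, 20, 25, 95, 80, 20, 20, 30, 27, 25, 40, 25, 25, 20, 35, 70],
                     dtype=float)

# Hypothetical stand-in objective: the (negated) "efficiency" is a concave
# quadratic maximized at the midpoint of each range.
midpoints = (minimum_values + maximum_values) / 2.0
objective = lambda x: np.sum((x - midpoints) ** 2)

# One (lower, upper) tuple per variable, accepted directly via `bounds`
bounds = list(zip(minimum_values, maximum_values))

for method in ('L-BFGS-B', 'TNC', 'SLSQP'):
    res = minimize(objective, x_initial, method=method, bounds=bounds)
    print(method, np.round(res.x[:3], 3))
```

All three methods should recover the midpoints here without any constraint dictionaries.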