
I'm trying to tighten the bounds in a scipy optimization. Some variables in my x vector are extremely restricted, and I cannot evaluate my objective function at other values.

For example, x1 can only take the values 19, 20, 24 and 40. Scipy asks for bounds such as (19, 40) for x1, but the solver then steps freely through any value in that interval.

Is there a way to optimize with scipy while constraining the steps taken for some variables?

from scipy import optimize

bounds = ((19, 40), ...)
x0 = (300, 19, ...)
resultado = optimize.minimize(fun, x0, bounds=bounds, method='TNC',
                              constraints=cons)
  • Perhaps check this: https://stackoverflow.com/questions/39236863/restrict-scipy-optimize-minimize-to-integer-values – Ardweaden Aug 16 '19 at 11:40
  • 1
    In general: no. The framework of the algorithm does not allow that. Your restrictions can be seen as enforcing integrality and more: defining a discontinuous set. Even the former, in the linear case, renders a problem NP-hard. When being linear, this is mixed-integer programming (and there are solvers available). When being nonlinear, there are additional categories, for example MINLP. Then convexity plays a role and some solvers to check out then could be Bonmin (only convex) or Couenne. MINLP's, especially in the non-convex setting are very hard though. – sascha Aug 16 '19 at 12:34
  • Can I solve a MINLP problem with scipy.optimize? – srgam Aug 20 '19 at 10:30

1 Answer


You can try hard-coding these values: return a very large value from the objective whenever x1 does not belong to the list you provided, so the optimizer backs away from those regions. But that will be a very inefficient operation.
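A minimal sketch of that penalty idea, assuming x1 is the second element of x (the objective fun, the starting point and the bounds below are placeholders, not your real problem):

import numpy as np
from scipy import optimize

ALLOWED_X1 = {19, 20, 24, 40}  # the only values x1 may take

def fun(x):
    # placeholder objective -- replace with your real function
    return (x[0] - 310.0) ** 2 + (x[1] - 24.0) ** 2

def penalized_fun(x):
    # treat x1 as feasible when it rounds to an allowed value;
    # otherwise return a huge value so the optimizer backs away
    if int(round(x[1])) not in ALLOWED_X1:
        return 1e10
    return fun(x)

resultado = optimize.minimize(penalized_fun, x0=(300.0, 19.0),
                              bounds=((0, 600), (19, 40)), method='TNC')

The jump to 1e10 makes the objective discontinuous, which is exactly why a gradient-based method like TNC handles this poorly.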

Another option is the basin-hopping algorithm (scipy.optimize.basinhopping), which lets you define your own step-taking routine via the take_step argument, as shown below.
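For example, a take_step callable can snap x1 back onto its allowed values after every random displacement. This is a sketch under the same assumption that x1 is x[1]; the objective, starting point and step size are illustrative:

import numpy as np
from scipy import optimize

ALLOWED_X1 = np.array([19.0, 20.0, 24.0, 40.0])

def fun(x):
    # placeholder objective -- replace with your real function
    return (x[0] - 310.0) ** 2 + (x[1] - 24.0) ** 2

class SnapStep:
    def __init__(self, stepsize=10.0):
        self.stepsize = stepsize  # basinhopping adjusts this attribute

    def __call__(self, x):
        # random displacement of every coordinate ...
        x = x + np.random.uniform(-self.stepsize, self.stepsize, x.shape)
        # ... then snap x1 to the nearest allowed value
        x[1] = ALLOWED_X1[np.abs(ALLOWED_X1 - x[1]).argmin()]
        return x

resultado = optimize.basinhopping(fun, x0=np.array([300.0, 19.0]),
                                  take_step=SnapStep(),
                                  minimizer_kwargs={'method': 'TNC',
                                                    'bounds': ((0, 600), (19, 40))})

Be aware that the local minimizations between hops can still move x1 off the grid, so in practice you would combine this with a penalty like the one above or round x1 afterwards.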

If possible, please specify whether by x1 you mean the second guess or the second element of the x vector.