
I'm building a multi-parameter optimization tool for a simulator in Python. I have 7 parameters, and changing them changes 5 result items (each parameter has different bounds). I don't know the simulator's equation, so I think I have to start from random values and iterate an optimization algorithm until I find the parameter values that bring the 5 items close to their objective values. Could you advise me on a suitable algorithm? If you could give me sample code, that would help me understand. Thanks in advance.

I tried a GA, but it takes too much time and it couldn't find suitable values. I think that's because the bounds are too large and there are many parameters to change.

hakseung lee

1 Answer


There are many libraries in Python dedicated to numerical optimization. I would recommend scipy.optimize for simpler tasks like the one you are describing, and pyomo for more complex optimization problems.

Problem type

Let's look at scipy.optimize. First, you need to know whether your optimization problem is convex or non-convex. Convex essentially means there is only a single local minimum, which is also the global minimum we want to find. Non-convex problems can have multiple local minima where optimization algorithms can get stuck.
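As a minimal, self-contained illustration of the convex case (a simple quadratic, not your simulator), `minimize` finds the single minimum regardless of the starting point:

```python
import numpy as np
from scipy.optimize import minimize

# f(x) = (x - 3)^2 is convex: exactly one minimum, at x = 3
res = minimize(lambda x: (x[0] - 3.0) ** 2, x0=np.array([10.0]))
print(res.x)  # close to [3.]
```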

Convex problems

For convex problems, you can simply use scipy.optimize.minimize. It requires a function f(x) that we want to minimize, an initial value x0, and (if available) the variable bounds.

A simple example:

from math import inf
import numpy as np
from scipy.optimize import minimize

def func(x):
    simulation_result = sim(x)  # use simulator here
    objective_vector = np.array([1,2,3,4,5])  # Replace this with your objective target vector
    return np.linalg.norm(simulation_result - objective_vector)

res = minimize(func, x0=np.ones(7),  # one initial value per parameter
               bounds=[(1, 2), (10, 20), (0, 1), (0, 1),
                       (0, inf), (-inf, 0), (-inf, inf)])  # one (min, max) pair per parameter; a seventh bound added to match the 7 parameters

if res.success:
    print(res.x)

Non-convex problems

This problem class is a lot harder and requires much more advanced algorithms. Luckily, scipy.optimize also provides algorithms for this! Check out my answer here and the documentation.
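One such global optimizer is scipy.optimize.differential_evolution, which only needs the parameter bounds (no initial guess) and is well suited to black-box, non-convex problems like this one. A sketch, using a toy stand-in for the simulator (the real `sim`, target vector, and bounds are yours to supply; note that differential_evolution requires finite bounds):

```python
import numpy as np
from scipy.optimize import differential_evolution

def sim(x):
    # Toy stand-in for the black-box simulator: 7 inputs -> 5 outputs.
    return np.array([x[0] + x[1], x[2] * x[3], np.sin(x[4]), x[5] ** 2, x[6]])

objective_vector = np.array([1, 2, 3, 4, 5])  # replace with your 5 target values

def func(x):
    # Distance between simulator output and the target vector.
    return np.linalg.norm(sim(x) - objective_vector)

bounds = [(0, 5)] * 7  # replace with the real (min, max) for each parameter
res = differential_evolution(func, bounds, seed=0)
print(res.x, res.fun)  # best parameters found and the remaining distance
```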

Romeo Valentin
  • Thank you very much! But I have questions about the code. I don't know the simulation's function. Could I insert an array in sim(x), like simulation_result=[0.9,1,2,214,381,...]? And what does objective_value=42 mean? My simulation results and objective value are both arrays. – hakseung lee Jun 17 '19 at 23:46
  • I just assume that the simulation takes some input variables `x` (your seven parameters) and outputs some result (your five result items), which I save in `simulation_result`. You do not need to know how the simulation works, you just need to obtain the output. – Romeo Valentin Jun 18 '19 at 09:46
  • And I actually misread your question regarding the objective value. Let's assume you want to get close to the target vector `[1,2,3,4,5]`, then you just return `np.linalg.norm(simulation_result - np.array([1,2,3,4,5]))`. – Romeo Valentin Jun 18 '19 at 09:47
  • Thanks, but how can I get the optimized parameter value set? – hakseung lee Jun 27 '19 at 08:38
  • `res.x` should hold the solution – Romeo Valentin Jun 28 '19 at 09:25
  • I just got [1. 1. 1. 1. 1. 1. 1.]. I don't think that can be the solution. – hakseung lee Jul 04 '19 at 04:59