
I want to use a multi-objective evolutionary algorithm to solve a single-objective optimization problem. Is this technically correct, and how can it be done?

sukhalid

2 Answers


Multi-objective algorithms typically work by comparing fitness scores. But when the fitness scores are multi-objective, how do you compare them? The magic is often in how the algorithm ranks those scores. With a single objective, this magic is lost when optimizing with a multi-objective algorithm, but all the same, it will probably still work quite well.

Since you're only optimizing a single objective, you can get away with using genetic algorithms at their simplest. For an example of that, see my answer at AI algorithm for multi dimension solution optimization / prediction.
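To make "genetic algorithms at their simplest" concrete, here is a minimal single-objective GA sketch. Everything here (the `simple_ga` name, truncation selection, uniform crossover, Gaussian mutation, and all parameter values) is my own illustrative choice, not taken from any library:

```python
import random

def simple_ga(fitness, bounds, pop_size=20, generations=100, mutation=0.2):
    """Minimal single-objective GA (minimization): truncation selection,
    uniform crossover, Gaussian mutation. A sketch, not a tuned implementation."""
    lo, hi = bounds
    dim = len(lo)
    # Random initial population within the box bounds
    pop = [[random.uniform(lo[i], hi[i]) for i in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        # Single objective: ranking is a plain scalar sort, no dominance needed
        scored = sorted(pop, key=fitness)
        parents = scored[:pop_size // 2]  # keep the better half (elitist)
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            # Uniform crossover: each gene comes from either parent
            child = [x if random.random() < 0.5 else y for x, y in zip(a, b)]
            if random.random() < mutation:
                # Perturb one coordinate and clamp it back into bounds
                j = random.randrange(dim)
                child[j] += random.gauss(0, 0.1 * (hi[j] - lo[j]))
                child[j] = min(max(child[j], lo[j]), hi[j])
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)

# Example: minimize the sphere function on [-5, 5]^2
best = simple_ga(lambda x: sum(v * v for v in x), ([-5, -5], [5, 5]))
print(best)
```

Because the elitist truncation step never discards the current best solution, the best fitness improves monotonically over generations.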

Joe
  • Joe, thanks for the answer. I actually need to optimize a profit function, so I split it into two sub-functions, income and expense. I am using SPEA2 for the optimization. – sukhalid Jan 19 '18 at 20:39

Let me add something to the previous answer. Multi-objective optimization is a generalization of single-objective optimization, which means single-objective optimization is a special case of it. Research in multi-objective optimization addresses the difficulty of having more than one objective value: performance is evaluated on a vector in the objective space rather than on a scalar. Every multi-objective optimization algorithm has to address this in order to work.

For single-objective optimization the domination concept in the objective space still holds. In bi-objective optimization (assuming minimization) we say that [3, 5] is dominated by [2, 4] (which is better in both objectives) and indifferent to [4, 2] (better in one objective, worse in the other). In single-objective optimization we compare vectors with a single element, which is the same as comparing scalars: we simply say 3 dominates 5.
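The dominance comparisons above can be sketched as a small helper function (the name `dominates` is just an illustrative choice, not part of any library):

```python
def dominates(a, b):
    """Return True if objective vector `a` dominates `b` (minimization):
    `a` is no worse in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

# Bi-objective examples from the text
print(dominates([2, 4], [3, 5]))  # True: better in both objectives
print(dominates([4, 2], [3, 5]))  # False: indifferent (once better, once worse)

# Single objective: dominance reduces to plain scalar comparison
print(dominates([3], [5]))        # True, i.e. 3 < 5
```

With a single element per vector, the dominance test degenerates exactly to `a[0] < b[0]`, which is why multi-objective algorithms remain well-defined on single-objective problems.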

Therefore, you can in general also run multi-objective optimization algorithms on a single-objective problem. However, a dedicated single-objective algorithm will often converge faster.

For instance, you can use the well-known NSGA-II algorithm to optimize a single-objective function using pymoo, a multi-objective optimization framework in Python.

Disclaimer: I am the main developer of pymoo.

The source code below uses NSGA-II (a multi-objective algorithm) to obtain the optimal solution for the Himmelblau function (a single-objective test problem):

from pymoo.algorithms.nsga2 import NSGA2
from pymoo.factory import get_problem
from pymoo.optimize import minimize

# Himmelblau: a standard single-objective test function
problem = get_problem("himmelblau")

# NSGA-II is multi-objective, but runs fine on a single objective,
# where non-dominated sorting reduces to a plain scalar ranking
algorithm = NSGA2(pop_size=20)

res = minimize(problem,
               algorithm,
               seed=1,
               verbose=True)

print(res.F)

which finds F=0.00034225 in 2000 function evaluations.

However, if you use Hooke-Jeeves pattern search, you will reach the optimum much faster:

from pymoo.algorithms.so_pattern_search import PatternSearch
from pymoo.factory import get_problem
from pymoo.optimize import minimize

problem = get_problem("himmelblau")

# Hooke-Jeeves pattern search: a local, single-objective method
algorithm = PatternSearch()

res = minimize(problem,
               algorithm,
               seed=1,
               verbose=True)

print(res.F)

This finds F=4.62182083e-18 in only 224 function evaluations.

Julian