
I would like to know whether constrained blackbox optimization is possible using mystic in python. If so, what algorithms are available in this optimization package?

Sreenath

1 Answer


I'm the author of mystic. Yes, blackbox constrained optimization is possible in mystic. To see more about it, look at the docs here: https://github.com/uqfoundation/mystic, and the documentation linked therein.

In terms of constrained optimization, there are about 50 examples in the repo: https://github.com/uqfoundation/mystic/tree/master/examples2. Constraints can be symbolic or functional, equality or inequality, hard or soft; they can be combined with and, or, and not, and applied to any of the optimizers. Mystic's constraints are also portable: they can be applied to other optimization codes, like scipy.optimize, as well as machine learning codes like sklearn.
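To make the portability point concrete, here is a minimal sketch of my own (not from the linked examples; the constraint and objective are just placeholders): a constraint built with mystic.symbolic is an ordinary function that maps a candidate point onto the feasible set, so it composes with any objective and can be handed to a non-mystic solver such as scipy.optimize.

import mystic.symbolic as ms
from scipy.optimize import minimize

# a toy symbolic constraint (placeholder)
eqn = "x0 + x1 == 1"
cons = ms.generate_constraint(ms.generate_solvers(ms.simplify(eqn)))

# the generated constraint is just a function mapping x onto the feasible set
print(cons([0.25, 0.25]))  # returns a point satisfying x0 + x1 == 1

# compose it with an objective and hand it to a non-mystic optimizer
rosen = lambda x: (1 - x[0])**2 + 100*(x[1] - x[0]**2)**2
res = minimize(lambda x: rosen(cons(x)), [0.0, 0.0], method='Nelder-Mead')
print(cons(res.x), res.fun)  # constrained minimizer and its value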

Mystic does not have a lot of optimizers, but the optimizers it has are highly customizable, letting you tweak almost every aspect of the optimization algorithm. Several more optimizers are mostly developed, and will be added in a release this summer/fall.
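As a rough illustration of that customizability, here is a small sketch of my own (the objective is a placeholder) using mystic's class-based solver interface, where the termination condition and a progress monitor are chosen by hand:

from mystic.solvers import NelderMeadSimplexSolver
from mystic.termination import CandidateRelativeTolerance as CRT
from mystic.monitors import VerboseMonitor

rosen = lambda x: (1 - x[0])**2 + 100*(x[1] - x[0]**2)**2  # placeholder objective

solver = NelderMeadSimplexSolver(2)              # 2-dimensional problem
solver.SetInitialPoints([1.5, 1.5])              # starting point
solver.SetGenerationMonitor(VerboseMonitor(10))  # report progress every 10 steps
solver.SetTermination(CRT(1e-6, 1e-6))           # hand-picked stopping condition
solver.Solve(rosen)

print(solver.bestSolution, solver.bestEnergy)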

Here's an explicit example from the above link:

"""
Maximize: f = 2*x[0]*x[1] + 2*x[0] - x[0]**2 - 2*x[1]**2
Subject to: -2*x[0] + 2*x[1] <= -2
             2*x[0] - 4*x[1] <= 0
               x[0]**3 -x[1] == 0
where: 0 <= x[0] <= inf
       1 <= x[1] <= inf
"""
import numpy as np
import mystic.symbolic as ms
import mystic.solvers as my
import mystic.math as mm

# generate a hard constraint from the equality, and a penalty from the inequalities
ieqn = '''
   -2*x0 + 2*x1 <= -2
    2*x0 - 4*x1 <= 0'''
eqn = '''
     x0**3 - x1 == 0'''
cons = ms.generate_constraint(ms.generate_solvers(ms.simplify(eqn,target='x1')))
pens = ms.generate_penalty(ms.generate_conditions(ieqn), k=1e3)
bounds = [(0., None), (1., None)]

# get the objective
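# (called with sign=-1 below, so the maximization becomes a minimization)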
def objective(x, sign=1):
  x = np.asarray(x)
  return sign * (2*x[0]*x[1] + 2*x[0] - x[0]**2 - 2*x[1]**2)

# solve    
x0 = np.random.rand(2)
sol = my.fmin_powell(objective, x0, constraints=cons, penalty=pens, disp=True,
                     bounds=bounds, gtol=3, ftol=1e-6, full_output=True,
                     args=(-1,))

print('x* = %s; f(x*) = %s' % (sol[0], -sol[1]))
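As a follow-up sketch of my own (not part of the original answer), the same constraints, penalty, and bounds should also drop into one of mystic's population-based solvers, e.g. diffev2, assuming it accepts the same keywords as fmin_powell above:

# hedged sketch: reuses objective, x0, cons, pens, bounds defined above
sol = my.diffev2(objective, x0, npop=40, constraints=cons, penalty=pens,
                 bounds=bounds, disp=True, ftol=1e-6, gtol=100,
                 full_output=True, args=(-1,))
print('x* = %s; f(x*) = %s' % (sol[0], -sol[1]))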
Mike McKerns
  • Thanks. I see that for local optimization there are two solvers, Powell and Nelder-Mead. I hope more algorithms such as SLSQP get added soon; these are available in scipy.optimize. One good thing about mystic is that it also has population-based algorithms such as differential evolution. Looking forward to more optimizers in mystic! – Sreenath Jun 14 '20 at 10:22
  • There are several that have been developed in an old branch, and they are targeted to be updated and released. I have found, however, that with the high level of solver customization possible in mystic, there is less need for other solvers. Regardless, more solvers are "coming soon". – Mike McKerns Dec 28 '20 at 13:51