find a list of the stationary points, their values and locations, and whether they are minima or maxima.
This is in general an unsolvable problem. Method 1 (symbolic) is the appropriate tool, but for complicated functions there is no symbolic solution for the stationary points (there is no general method for solving a system of two equations symbolically).
Symbolic solution with SymPy
For simple functions like your example, SymPy will work fine. Here is a complete example of finding the stationary points and classifying them by the eigenvalues of the Hessian.
import sympy as sym
x, y = sym.symbols("x y")
f = sym.cos(x*10)**2 + sym.sin(y*10)**2
gradient = sym.derive_by_array(f, (x, y))  # array of the two partial derivatives
hessian = sym.Matrix(2, 2, sym.derive_by_array(gradient, (x, y)))  # matrix of second derivatives
So far, hessian is a symbolic 2-by-2 matrix: [[200*sin(10*x)**2 - 200*cos(10*x)**2, 0], [0, -200*sin(10*y)**2 + 200*cos(10*y)**2]]. Next, we find the stationary points by equating the gradient to zero, and plug them into the Hessian one by one.
stationary_points = sym.solve(gradient, (x, y))
for p in stationary_points:
    value = f.subs({x: p[0], y: p[1]})
    hess = hessian.subs({x: p[0], y: p[1]})
    eigenvals = hess.eigenvals()  # dict mapping each eigenvalue to its multiplicity
    if all(ev > 0 for ev in eigenvals):
        print("Local minimum at {} with value {}".format(p, value))
    elif all(ev < 0 for ev in eigenvals):
        print("Local maximum at {} with value {}".format(p, value))
    elif any(ev > 0 for ev in eigenvals) and any(ev < 0 for ev in eigenvals):
        print("Saddle point at {} with value {}".format(p, value))
    else:
        print("Could not classify the stationary point at {} with value {}".format(p, value))
The last clause is necessary because when the Hessian is only semidefinite, we cannot tell what kind of stationary point it is (x**2 + y**4 and x**2 - y**4 have the same Hessian at (0, 0) but different behavior). The output:
Saddle point at (0, 0) with value 1
Local maximum at (0, pi/20) with value 2
Saddle point at (0, pi/10) with value 1
Local maximum at (0, 3*pi/20) with value 2
Local minimum at (pi/20, 0) with value 0
Saddle point at (pi/20, pi/20) with value 1
Local minimum at (pi/20, pi/10) with value 0
Saddle point at (pi/20, 3*pi/20) with value 1
Saddle point at (pi/10, 0) with value 1
Local maximum at (pi/10, pi/20) with value 2
Saddle point at (pi/10, pi/10) with value 1
Local maximum at (pi/10, 3*pi/20) with value 2
Local minimum at (3*pi/20, 0) with value 0
Saddle point at (3*pi/20, pi/20) with value 1
Local minimum at (3*pi/20, pi/10) with value 0
Saddle point at (3*pi/20, 3*pi/20) with value 1
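To see why the semidefinite case really is ambiguous, here is a small check (using SymPy's built-in sym.hessian helper rather than derive_by_array) that the two functions mentioned above share the same Hessian at the origin:

```python
import sympy as sym

x, y = sym.symbols("x y")
# x**2 + y**4 has a minimum at (0, 0); x**2 - y**4 has a saddle there.
# Yet both Hessians at the origin have eigenvalues 2 and 0 (semidefinite).
for g in (x**2 + y**4, x**2 - y**4):
    hess_at_origin = sym.hessian(g, (x, y)).subs({x: 0, y: 0})
    print(g, hess_at_origin.eigenvals())
```

The eigenvalue test in the loop above would hit its else branch for both functions, which is exactly why the "could not classify" clause is there.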
Obviously, solve did not find all solutions (there are infinitely many of them). Compare solve vs solveset; but in any case, handling infinitely many solutions is hard.
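As a sketch of what solveset returns for this kind of equation (a set parametrized over the integers rather than a finite list), consider the one-variable derivative of the same f:

```python
import sympy as sym

x = sym.symbols("x")
# d/dx of cos(10*x)**2 is -10*sin(20*x), which vanishes at x = n*pi/20.
sol = sym.solveset(sym.diff(sym.cos(10*x)**2, x), x, sym.S.Reals)
print(sol)  # a union of ImageSets over the integers, not a finite list
```

Membership tests like sym.pi/20 in sol work, but enumerating and classifying all of these points still has to be done by hand.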
Numeric optimization with SciPy
SciPy offers many numerical minimization routines, including brute force (which is your method 2; it is generally very slow). These are powerful methods, but keep the following in mind.
- Each run will only find one minimum.
- By replacing f with -f, you can also find maxima.
- Changing the starting point of the search (the argument x0 of minimize) may yield another minimum or maximum. Still, you will never know whether there are other extrema that you have not seen yet.
- None of this will find saddle points.
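A minimal sketch of this approach, using scipy.optimize.minimize on a NumPy version of the same f (the starting point (1, 2) is an arbitrary choice):

```python
import numpy as np
from scipy.optimize import minimize

def f(v):
    x, y = v
    return np.cos(10 * x)**2 + np.sin(10 * y)**2

# One run finds one local minimum near the starting point x0.
res = minimize(f, x0=(1, 2))
print(res.x, res.fun)

# To search for a maximum instead, minimize -f.
res_max = minimize(lambda v: -f(v), x0=(1, 2))
print(res_max.x, -res_max.fun)
```

Different values of x0 will land in different basins, which is the only way this approach can explore several extrema.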
Mixed strategy
Using lambdify, one can turn a symbolic expression into a Python function that can be passed to SciPy's numeric solvers.
from scipy.optimize import fsolve
grad = sym.lambdify((x, y), gradient)  # numeric version of the symbolic gradient
fsolve(lambda v: grad(v[0], v[1]), (1, 2))  # find a root of the gradient
This returns some stationary point, [0.9424778, 2.04203522] in this example. Which point it is depends on the initial guess, here (1, 2). Typically (but not always) you will get a solution near the initial guess.
This has an advantage over the direct minimization approach in that saddle points can be detected as well. Still, it is going to be hard to find all the solutions, as each run of fsolve brings up only one.