
I've been looking around for a nonlinear constrained optimization package for Python (to deal with problems that are NOT necessarily convex) that can directly handle matrix variables. More specifically, I'm dealing with optimization problems where the optimization variables are matrices, and where there are equality constraints with both sides of the equations being matrices. An example of such an optimization problem is the Orthogonal Procrustes problem (https://en.wikipedia.org/wiki/Orthogonal_Procrustes_problem).

In my search I have come across SciPy, pyOpt, Ipopt and GEKKO, but none of them seems to directly support matrix variables (at least from the documentation I was able to find). I have considered doing some maneuvering to convert the matrices into vectors when necessary and vice versa (through numpy.reshape or something similar), but I would like to avoid that option as much as possible. The reason is that my problems are fairly large, and I worry that constantly reshaping arrays would significantly harm the efficiency of the optimization procedure.
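For reference, a quick check of the reshape cost (this is what makes me unsure whether the efficiency concern is justified): on a contiguous array, numpy.reshape returns a view rather than a copy, so flattening a matrix variable to a vector and back should be nearly free.

```python
import numpy as np

# reshape on a contiguous array returns a view, not a copy,
# so flattening matrix variables to vectors costs almost nothing.
A = np.arange(12.0).reshape(3, 4)   # a 3x4 "matrix variable"
v = A.reshape(-1)                   # flattened to a vector

# Both names refer to the same memory buffer -- no data was copied.
print(np.shares_memory(A, v))       # True

# Mutating through the vector view is visible in the matrix.
v[0] = 99.0
print(A[0, 0])                      # 99.0
```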

Any suggestions?

  • `numpy` reshape is (usually) cheap. But what's special about your 'matrix variables'? Just 2d shape? – hpaulj Jan 27 '20 at 02:23
  • Pyomo allows indexed variables and supports global nonlinear solvers. – Erwin Kalvelagen Jan 27 '20 at 02:24
  • @hpaulj There's nothing special about the matrix variables. I think I could transform the problems so that they are defined from the perspective of '1D arrays'. I just wanted to see if there was a way of formulating the problems in Python as it's done in CVX for Matlab (see for example http://stanford.edu/class/ee363/notes/lmi-cvx.pdf). – DOspinaAcero Jan 27 '20 at 18:46
  • Are your large matrices also sparse? It may be beneficial to find packages that exploit sparsity of the underlying matrix operations. I've used the sparse matrices in Gekko and they work well for large-scale problems at least for state-space models. https://apmonitor.com/wiki/index.php/Apps/LinearStateSpace It sounds like you need something where the matrices themselves are variables. – TexasEngineer Feb 02 '20 at 00:52
  • @TexasEngineer The matrices may or may not be sparse, but I appreciate you pointing out the possibility of exploiting that reality when applicable. And about the last part of your comment, yes, in the problems that I have been considering, the optimization variables are matrices. – DOspinaAcero Feb 03 '20 at 15:19
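As a side note on the Orthogonal Procrustes example linked in the question: SciPy ships a direct, SVD-based solver for that particular problem, scipy.linalg.orthogonal_procrustes, although it does not generalize to arbitrary matrix equality constraints. A minimal sketch (the test matrices here are made up for illustration):

```python
import numpy as np
from scipy.linalg import orthogonal_procrustes

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))

# Build a known orthogonal matrix Q via QR, then a target B = A @ Q.
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
B = A @ Q

# Recover the orthogonal R minimizing ||A R - B||_F (closed form via SVD).
R, scale = orthogonal_procrustes(A, B)
print(np.allclose(R, Q))            # True: exact data, so R recovers Q
```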

1 Answer


Here is a problem with matrices:

 min  sum(sum(B))
 s.t. AX=B
      sum(sum(A))=5 
      sum(sum(X))=2 

It is configured with the m.Array method in Python GEKKO, with A, X, and B as 2D matrices, but they could be higher-dimensional as well.

from gekko import GEKKO
import numpy as np
m = GEKKO(remote=False)
ni = 3; nj = 2; nk = 4
# solve AX=B
A = m.Array(m.Var,(ni,nj),lb=0)
X = m.Array(m.Var,(nj,nk),lb=0)
AX = np.dot(A,X)
B = m.Array(m.Var,(ni,nk),lb=0)
# equality constraints
m.Equations([AX[i,j]==B[i,j] for i in range(ni) \
                             for j in range(nk)])
m.Equation(5==m.sum([m.sum([A[i][j] for i in range(ni)]) \
                                    for j in range(nj)]))
m.Equation(2==m.sum([m.sum([X[i][j] for i in range(nj)]) \
                                    for j in range(nk)]))
# objective function
m.Minimize(m.sum([m.sum([B[i][j] for i in range(ni)]) \
                                 for j in range(nk)]))
m.solve()
print(A)
print(X)
print(B)

You mentioned non-convexity, so you may need to use a multi-start or other method to find a global solution in Gekko, or else use a global optimizer in a different Python package. The axb and qobj objects are valuable if some of your constraints are linear equations or your objectives are quadratic with constant matrices; you can use those for large-scale and sparse systems. The APMonitor and Gekko papers also review some of the other Python packages for optimization.
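To illustrate the multi-start idea, here is a rough sketch using scipy.optimize.minimize (SLSQP) on the same toy problem, with B eliminated via B = AX and the matrices flattened into a single decision vector. This is not Gekko syntax, just one way to run several local solves from random starting points and keep the best result:

```python
import numpy as np
from scipy.optimize import minimize

ni, nj, nk = 3, 2, 4

def unpack(z):
    # Split the flat decision vector back into the matrices A and X.
    A = z[:ni * nj].reshape(ni, nj)
    X = z[ni * nj:].reshape(nj, nk)
    return A, X

def objective(z):
    A, X = unpack(z)
    return np.sum(A @ X)            # B is eliminated via B = A @ X

constraints = [
    {"type": "eq", "fun": lambda z: np.sum(unpack(z)[0]) - 5.0},  # sum(A) = 5
    {"type": "eq", "fun": lambda z: np.sum(unpack(z)[1]) - 2.0},  # sum(X) = 2
]
bounds = [(0.0, None)] * (ni * nj + nj * nk)   # all entries nonnegative

# Multi-start: solve from several random initial points, keep the best.
rng = np.random.default_rng(1)
best = None
for _ in range(10):
    z0 = rng.uniform(0.1, 1.0, ni * nj + nj * nk)
    res = minimize(objective, z0, method="SLSQP",
                   bounds=bounds, constraints=constraints)
    if res.success and (best is None or res.fun < best.fun):
        best = res

A, X = unpack(best.x)
print(np.sum(A), np.sum(X))         # constraint sums, ~5.0 and ~2.0
```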

John Hedengren
  • thanks for the information. Your example offers a few ideas about how to implement certain things with GEKKO. However, it's not quite what I was after, as the linear algebra part is not incorporated. From your answer (and from the documentation that I've seen about the different optimization packages in Python), it seems that there's no option but to formulate the problems with the "entry-wise" mentality. – DOspinaAcero Jan 29 '20 at 15:55
  • I think the issue is that most optimization packages perform operator overloading for automatic differentiation and some of the matrix operations are not supported for AD. Scipy.optimize.minimize supports most of the matrix operations but will be much slower for large-scale problems because it does not require exact derivatives. – John Hedengren Jan 29 '20 at 21:43