
I am trying to solve the constrained least-squares problem:

min_x || Ax - B ||^2
    (A and B are built from samples of t in [0, 1])

subject to the coefficients x satisfying the linear equation:

C x = D

This system fits a weighted set of polynomials to approximate a function F(t) over the range of t.

  • A is a matrix whose columns are the basis polynomials evaluated at the sampled t values
  • x is the vector of coefficients (what I want), i.e. the weight applied to each polynomial in A
  • B is the vector of F(t) values at those same t samples
  • C is a matrix and D a vector, which together represent the boundary conditions on the coefficients of this system
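
For concreteness, a minimal sketch of how I build these (the monomial basis, the example F(t) = sin(2*pi*t) and the endpoint conditions are only illustrative):

import numpy as np

t = np.linspace(0.0, 1.0, 50)            # sample grid for t in [0, 1]
n = 5                                    # number of basis polynomials (example)

A = np.vander(t, n, increasing=True)     # column j holds t**j evaluated on the grid
B = np.sin(2 * np.pi * t)                # example F(t) values to fit

# example boundary conditions: force the fit to match F at t = 0 and t = 1
t_bc = np.array([0.0, 1.0])
C = np.vander(t_bc, n, increasing=True)
D = np.sin(2 * np.pi * t_bc)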

This is ordinary least squares subject to a linear equality constraint.

While there are known closed-form solutions, e.g. via the Karush-Kuhn-Tucker (KKT) conditions, I'm looking for a routine in scipy / numpy that can be used to solve this.
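
For reference, a minimal numpy sketch of that closed-form route, solving the KKT system directly (constrained_lstsq_kkt is just an illustrative name; this is the hand-rolled version I'd like a library routine to replace):

import numpy as np

def constrained_lstsq_kkt(A, B, C, D):
    """Solve min ||Ax - B||^2 subject to Cx = D via the KKT system."""
    n = A.shape[1]                       # number of coefficients
    m = C.shape[0]                       # number of equality constraints
    K = np.block([[2 * A.T @ A, C.T],
                  [C, np.zeros((m, m))]])
    rhs = np.concatenate([2 * A.T @ B, D])
    sol = np.linalg.solve(K, rhs)
    return sol[:n]                       # drop the Lagrange multipliers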

My research has turned up the scipy.optimize module, which includes functions such as:

  • scipy.optimize.least_squares
  • scipy.optimize.nnls
  • scipy.optimize.lsq_linear

The above were suggested by this question and this question.

But none of these accept a constraint given by another linear equation such as C x = D. What can I use in scipy or numpy to do this?

  • what is variable `t` – jf328 Feb 26 '19 at 11:07
  • @jf328 edited q – Chuck Feb 26 '19 at 11:12
  • Analytical solution should be the first to go. If you definitely want to use scipy routine, probably put the constraint as a penalization in the objective. `A2 = [A;m*C], B2 = [B;m*D]` and solve OLS `||A2.x-B2||^2` with large `m` – jf328 Feb 26 '19 at 11:24
  • Just use scipy.optimize.minimize (and provide your own gradient for stability). And if your t in [0,1] is a variable-bound, i don't see the closed-form solution mentioned here multiple times. – sascha Feb 28 '19 at 11:47
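
A minimal sketch of the weighting trick from jf328's comment (reusing A, B, C, D as built in the question; the weight m = 1e8 is an arbitrary large value):

import numpy as np

# stack the constraint rows onto the design matrix with a large weight m,
# then solve one ordinary least-squares problem; large m pushes Cx towards D
m = 1e8
A2 = np.vstack([A, m * C])
B2 = np.concatenate([B, m * D])
x_pen, *_ = np.linalg.lstsq(A2, B2, rcond=None)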

1 Answer


For this you may use scipy.optimize.minimize(method='SLSQP'). The documentation is here.

You can define the equality constraint as a callable that returns the constraint residuals. SLSQP drives every component to zero, so return the residual vector rather than its sum (positive and negative violations would otherwise cancel):

def cons1(x):
    # equality constraint: D - C @ x == 0, component-wise
    return D - C @ x

SLSQP then uses the constraint to drive your optimisation problem. Note that it is a gradient-based descent, so on general high-dimensional problems you will most likely find only a local minimum (for your convex constrained least-squares problem any local minimum is also global).
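
A minimal end-to-end sketch of this approach (the arrays below are random stand-ins for the real A, B, C, D):

import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 5))         # stand-ins for your A, B, C, D
B = rng.standard_normal(50)
C = rng.standard_normal((2, 5))
D = C @ np.ones(5)                       # a right-hand side that is feasible

def objective(x):
    r = A @ x - B
    return r @ r                         # ||Ax - B||^2

def objective_grad(x):
    return 2 * A.T @ (A @ x - B)         # analytic gradient for stability

def cons1(x):
    return D - C @ x                     # equality constraint residuals

x0 = np.zeros(A.shape[1])
res = minimize(objective, x0, jac=objective_grad, method='SLSQP',
               constraints=[{'type': 'eq', 'fun': cons1}])
print(res.x)
print("constraint residual:", np.abs(C @ res.x - D).max())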

Another option is scipy.optimize.minimize(method='trust-constr'); the documentation is here. These methods are implemented natively in Python, so the source code is accessible and easy to modify.
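
With trust-constr the same equality can be passed as a LinearConstraint with equal lower and upper bounds (this snippet reuses objective, objective_grad, x0, C and D from the sketch above):

from scipy.optimize import minimize, LinearConstraint

lin_con = LinearConstraint(C, D, D)      # D <= C @ x <= D, i.e. C @ x = D
res_tc = minimize(objective, x0, jac=objective_grad,
                  method='trust-constr', constraints=[lin_con])
print(res_tc.x)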

If you have a smooth, monotonic or convex function, in my experience SLSQP should suffice.
