7

I was wondering if there is a nice way (preferably using JuMP) to get all optimal solutions of a linear program (in case there are multiple optimal solutions).

An example

Minimize the statistical distance (Kolmogorov distance) between two probability distributions.

min (1/2) * sum_{i=1}^{4} |P[i] - Q[i]| over the free variable Q
P = [0.25,0.25,0.25,0.25]
sum_i P[i] = 1
Q[1] + Q[4] = 1
sum_i Q[i] = 1 -> Q[2],Q[3] = 0

Note that we can phrase the optimization as a linear program; the objective becomes

min (1/2) * sum_i S[i]
S[i] >= P[i]-Q[i]
S[i] >= Q[i]-P[i]
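
For reference, a minimal JuMP sketch of this LP (HiGHS as the solver is an assumption; any LP solver usable from JuMP works). It returns just one optimal vertex, which is exactly the limitation discussed below:

using JuMP, HiGHS

P = [0.25, 0.25, 0.25, 0.25]

model = Model(HiGHS.Optimizer)
@variable(model, Q[1:4] >= 0)          # the free distribution Q
@variable(model, S[1:4] >= 0)          # S[i] models |P[i] - Q[i]|
@constraint(model, sum(Q) == 1)
@constraint(model, Q[1] + Q[4] == 1)
@constraint(model, [i = 1:4], S[i] >= P[i] - Q[i])
@constraint(model, [i = 1:4], S[i] >= Q[i] - P[i])
@objective(model, Min, 0.5 * sum(S))
optimize!(model)
println(value.(Q), "  distance = ", objective_value(model))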

There is no unique solution to this problem; instead, the set of optimal solutions is the convex hull of

Q1 = [0.75,0,0,0.25]
Q2 = [0.25,0,0,0.75]

Both attain the minimal distance of 0.5, and any convex combination of these two solutions is optimal.

I was wondering if there is a nice way to find all these optimal extreme points (the points that span the set of optimal solutions)?

Why am I interested in this: the point that gives the maximal Bhattacharyya coefficient (a concave function) lies somewhere in the middle of the optimal set of the statistical distance.

So far I've tried to find optimal P,Q pairs (referring to the example I gave) by making the algorithm favor minimizing the distance between P[i] and Q[i], adding a weight of 1.001 to this term in the sum. It seems to work to some extent, although I can hardly know for sure.

balletpiraat
  • You can probably solve the LP and then try to maximise the Bhattacharyya coefficient given the objective function value of the LP that you solved. Even if you have all optimal solutions (vertices), it is unclear how you are going to find the optimal faces of the underlying polyhedron, and how you will perform the search (over these faces) so as to maximise the Bhattacharyya coefficient. If the optimal solution lies "in the middle", which can happen because the function is concave, the optimal vertices themselves are of little use. – Ioannis Apr 03 '16 at 10:52
  • I tried to edit the question directly, but apparently I cannot; in your linear program, the absolute value must be decomposed for each term of the objective function, that is `min sum A[i]` subject to `A[i] >= P[i] - Q[i]` and `A[i] >= Q[i] - P[i]` for each `1 <= i <= 4` – Nicolas Grebille Apr 03 '16 at 14:13
  • I fixed the mistake, thank you ;-) – balletpiraat Apr 06 '16 at 20:55
  • Note that there may be a HUGE number of vertices: for example, the [Klee-Minty cube](https://en.wikipedia.org/wiki/Klee-Minty_cube) is a D x D LP with 2^D vertices. However I don't know how many can have the same c . x . – denis Oct 13 '19 at 15:04

3 Answers

5

There is an interesting way to enumerate all possible optimal LP solutions (or rather all optimal LP bases) using a standard MIP solver. Basically the algorithm is:

step 1. solve LP/MIP
step 2. if infeasible or if objective starts to deteriorate: stop
step 3. add cuts (constraints) to the model to forbid current optimal solution
step 4. goto step 1 

For an example see here.
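
To make the loop concrete, here is a rough JuMP sketch for the example from the question. It follows the steps above only loosely (the linked post encodes the basis status more carefully): the LP is put in equality form, binary variables b[j] mark which columns may be nonzero, the objective is pinned to the LP optimum, and each found column set is forbidden with a cut. HiGHS, the big-M value of 2, and the reformulation below are assumptions; because of degeneracy, many different column sets map to the same point, so expect more iterations than distinct solutions.

using JuMP, HiGHS

P = [0.25, 0.25, 0.25, 0.25]

# LP in equality form: the columns are [Q; S; u; v], where u and v are the
# surpluses of S[i] >= P[i] - Q[i] and S[i] >= Q[i] - P[i].
# That gives m = 10 equality rows and n = 16 columns.
function base_model()
    model = Model(HiGHS.Optimizer)
    set_silent(model)
    @variable(model, Q[1:4] >= 0)
    @variable(model, S[1:4] >= 0)
    @variable(model, u[1:4] >= 0)
    @variable(model, v[1:4] >= 0)
    @constraint(model, sum(Q) == 1)
    @constraint(model, Q[1] + Q[4] == 1)
    @constraint(model, [i = 1:4], S[i] + Q[i] - u[i] == P[i])
    @constraint(model, [i = 1:4], S[i] - Q[i] - v[i] == -P[i])
    return model, [Q; S; u; v]
end
n, m, M = 16, 10, 2.0

# Step 1 (first pass): plain LP to obtain the optimal value z*.
lp, x = base_model()
@objective(lp, Min, 0.5 * sum(x[5:8]))          # (1/2) * sum(S)
optimize!(lp)
zstar = objective_value(lp)

# Enumeration MIP: objective pinned to z* (use a small tolerance in general),
# binaries b[j] select which columns are allowed to be nonzero.
mip, y = base_model()
@constraint(mip, 0.5 * sum(y[5:8]) == zstar)
@variable(mip, b[1:n], Bin)
@constraint(mip, [j = 1:n], y[j] <= M * b[j])
@constraint(mip, sum(b) == m)

solutions = Set{Vector{Float64}}()
while true
    optimize!(mip)                                        # step 1
    termination_status(mip) == MOI.OPTIMAL || break       # step 2: infeasible -> stop
    push!(solutions, round.(value.(y[1:4]), digits = 6))  # record the Q part
    chosen = [j for j in 1:n if value(b[j]) > 0.5]
    @constraint(mip, sum(b[j] for j in chosen) <= m - 1)  # step 3: forbid this set
end                                                       # step 4: repeat
println(solutions)   # for this example, the two extreme points should show up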

Erwin Kalvelagen
  • Thank you! We will try this approach :) Seems very straightforward, curious how many bfs we will find! – balletpiraat Apr 06 '16 at 20:53
  • Now that I'm trying to implement such a scheme I'm not quite sure I get what you are saying; do we have to assume our solutions are 0-1 integer solutions? Or do we use 0-1 integers to keep track of the basic feasible points we already know? – balletpiraat Apr 11 '16 at 09:32
  • What is leading you to believe the solution values for x can only be 0-1? – Erwin Kalvelagen Apr 11 '16 at 15:59
  • That's not what I believe. However, I was led to believe this should somehow be the case when looking at [link](http://yetanothermathprogrammingconsultant.blogspot.nl/2011/10/special-case-of-integer-cuts.html). I'm not sure which kind of cuts would be suitable for the problem I put forward. – balletpiraat Apr 13 '16 at 11:41
  • In the simple case where I can assume my variables Q[i] are 0-1 at the extreme points, I can give this information to my solver. In that case the cuts you proposed, sum_{i s.t. knownP[i] = 1} Q[i] <= n - 1, work like a charm. In the case where 0 <= Q[i] <= 1 at the extreme points, it is unclear to me what cuts to make to find all the extreme points. Intuitively the cut should go through another extreme point, to make sure we don't get a solution at a point in between extreme points. – balletpiraat Apr 13 '16 at 13:10
  • I think you totally missed the whole point about this algorithm. The variables x are continuous. The variables beta (used to encode the basis status) are binary. We use cuts on the basis variables beta. – Erwin Kalvelagen Apr 13 '16 at 13:15
4

LP solvers are not designed to enumerate all optimal solutions. Once you know the optimal objective value, you can define the polyhedron containing all optimal solutions and then use a vertex enumeration algorithm to collect the possibly very large set of extreme points of this polyhedron. All optimal solutions are convex combinations of these extreme points. From Julia, you could use the wrapper for cdd.
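
For instance, here is a sketch of this approach with Polyhedra.jl and the CDDLib wrapper for cdd (the package names, the exact-arithmetic backend, and the value z* = 1/2 for the example above are assumptions; the polyhedron below is the LP's feasible set with the objective fixed at z*):

using Polyhedra
import CDDLib

P = [1//4, 1//4, 1//4, 1//4]
zstar = 1//2                               # optimal value from the LP solve

# Coordinates are x = [Q1..Q4, S1..S4]; e(i) is the i-th unit vector.
e(i) = [j == i ? 1//1 : 0//1 for j in 1:8]

# Equalities: sum(Q) = 1, Q1 + Q4 = 1, and the fixed objective (1/2)*sum(S) = z*.
eqs = HyperPlane(e(1) + e(2) + e(3) + e(4), 1//1) ∩
      HyperPlane(e(1) + e(4), 1//1) ∩
      HyperPlane(e(5) + e(6) + e(7) + e(8), 2 * zstar)

# Inequalities (a'x <= b): S_i >= |P_i - Q_i| split in two, plus Q_i >= 0, S_i >= 0.
ineqs = reduce(∩, (HalfSpace(-e(i) - e(4 + i), -P[i]) ∩
                   HalfSpace( e(i) - e(4 + i),  P[i]) ∩
                   HalfSpace(-e(i), 0//1) ∩
                   HalfSpace(-e(4 + i), 0//1) for i in 1:4))

poly = polyhedron(eqs ∩ ineqs, CDDLib.Library(:exact))
for pt in points(vrep(poly))               # vertex enumeration via cdd
    println("Q = ", pt[1:4], "   S = ", pt[5:8])
end
# Should print the two extreme points Q = [3//4, 0, 0, 1//4] and Q = [1//4, 0, 0, 3//4].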

mlubin
3

I don't know about Julia, but there is a tool called PPL that you can use to determine all the vertices of the solution polyhedron after you have solved the linear program.

See my answer here to a similar question: Find all alternative basic solutions using existing linear-programming tool.

Nicolas Grebille