
Z3 can solve optimisation problems using z3.Optimize. My goal is to apply it in the following way: given a Boolean formula of propositional logic and a set of constraints, find all models that optimise the constraints, i.e. fulfil as many of them as possible. My ultimate goal is to select one of these models at random.

Minimal example: Given the formula Or(x,y) and constraints Not(x) and Not(y), find the solutions [x: True, y: False] and [x: False, y: True].

While z3.Optimize.model() returns one optimal solution, it seems to be deterministic and always produces the solution [x: False, y: True]:

from z3 import *

x, y = Bools('x y')
s = Optimize()
s.add(Or(x,y))
s.add_soft(Not(x), weight="1")
s.add_soft(Not(y), weight="1")
s.check()
print(s.model())

When all_smt() provided by @alias is applied to this problem, list(all_smt(s, [x,y])) outputs:

[[x = False, y = True], [y = False, x = True], [y = True, x = True]]

So all_smt() finds all solutions, including the non-optimal [x: True, y: True].

What is the best way to find all optimal solutions? Alternatively, is it possible to find an optimal solution in a random way without generating all optimal solutions first?

1 Answer

It's not quite clear what you mean by "optimal" in this context. It appears you allow the soft-constraints to be violated, but at least one of them must hold? (After all, [x=True, y=True] is a valid solution that violates both of your soft-constraints; and you did allow it by explicitly calling all_smt.)

Assuming you are indeed looking for solutions that satisfy at least one of the soft-constraints, you should explicitly code that up and assert it as a separate predicate. The easiest way to do this is with tracker variables:

from z3 import *

def all_smt(s, initial_terms):
    # Block the current value of term t, forcing a different choice next time.
    def block_term(s, m, t):
        s.add(t != m.eval(t, model_completion=True))
    # Pin term t to its current value while exploring alternatives for later terms.
    def fix_term(s, m, t):
        s.add(t == m.eval(t, model_completion=True))
    def all_smt_rec(terms):
        if sat == s.check():
            m = s.model()
            yield m
            for i in range(len(terms)):
                s.push()
                block_term(s, m, terms[i])
                for j in range(i):
                    fix_term(s, m, terms[j])
                yield from all_smt_rec(terms[i:])
                s.pop()
    yield from all_smt_rec(list(initial_terms))

x, y = Bools('x y')
s = Optimize()

s.add(Or(x,y))

p1, p2 = Bools('p1 p2')
s.add(Implies(p1, Not(x)))
s.add(Implies(p2, Not(y)))

asserted = If(p1, 1, 0) + If(p2, 1, 0)
s.add(asserted >= 1)
s.minimize(asserted)

print([[(x, m[x]), (y, m[y])] for m in all_smt(s, [x,y])])

Here, we use p1 and p2 to track whether each soft-constraint is asserted, and we minimise the number of asserted soft-constraints via the variable asserted, while requiring that at least one of them holds. When run, the above prints:

[[(x, True), (y, False)], [(x, False), (y, True)]]

which seems to be what you wanted to achieve in the first place.

alias
  • Thanks! Sorry about the imprecision. By “optimal” I indeed meant “minimise violations of soft constraints”. – Felix Emanuel Feb 02 '22 at 15:41
  • No worries. Then the tracker-variable variant is exactly what you want. – alias Feb 02 '22 at 15:44
  • I seem to be getting the correct results by just asserting `asserted = If(Not(x), 1, 0) + If(Not(y), 1, 0)` and not adding the tracker variables at all. Now I'm wondering whether the method can work without the tracker variables, or whether this may have unintended consequences? – Felix Emanuel Feb 02 '22 at 19:29
  • For this particular case, yes; it appears you don't actually need trackers. Tracking variables are a general technique that you can use if you have a solver that doesn't offer optimization; though it can also make coding easier in certain cases. See https://stackoverflow.com/questions/70771595/how-to-bias-z3s-python-sat-solving-towards-a-criteria-such-as-preferring-t/70773903#70773903 for a similar problem and further pointers. – alias Feb 02 '22 at 20:14