15

Let's say I have a computation that involves only addition and multiplication:

(a+b)*(c+d)

which can be computed in many other ways, e.g.

a*(c+d) + b*(c+d)
a*c + a*d + b*c + b*d

In terms of (additions, multiplications), the operation counts for the three examples above are (2,1), (3,2), and (3,4) respectively. Clearly, if the goal is to reduce the total number of operations, the first is superior. Given an arbitrary expression, is there a way to find the computation order that requires the fewest operations?
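For concreteness, this is how I'm counting operations (a quick sketch; the tuple-based tree representation is just for illustration):

```python
# Count (additions, multiplications) in an expression tree, where each
# node is ("+"|"*", left, right) and a leaf is a bare variable name.
def count_ops(expr):
    if not isinstance(expr, tuple):
        return (0, 0)  # a variable by itself costs nothing
    op, left, right = expr
    la, lm = count_ops(left)
    ra, rm = count_ops(right)
    return (la + ra + (op == "+"), lm + rm + (op == "*"))

# (a+b)*(c+d)
e1 = ("*", ("+", "a", "b"), ("+", "c", "d"))
# a*(c+d) + b*(c+d)
e2 = ("+", ("*", "a", ("+", "c", "d")), ("*", "b", ("+", "c", "d")))
# a*c + a*d + b*c + b*d
e3 = ("+", ("+", ("*", "a", "c"), ("*", "a", "d")),
           ("+", ("*", "b", "c"), ("*", "b", "d")))

print(count_ops(e1), count_ops(e2), count_ops(e3))  # (2, 1) (3, 2) (3, 4)
```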

Note: This question is being re-asked from SE.math for the insight and perspective of the CS crowd.

Hooked
  • Given the difficulties found in optimizing matrix multiplication, I strongly suspect that this is an unsolved problem. – RBarryYoung Dec 21 '10 at 02:06
  • If the goal is really to reduce the total number of operations then why would you use a computer to figure it out? I'm just sayin'. – Zac Thompson Dec 22 '10 at 00:39

5 Answers

9

What you want is to effectively generate all possible equivalent algebraic expressions and choose the one with the lowest cost (adding X three times is cheaper on most machines than multiplying X by 3).

It is impractical to do this, as the number of "equivalent" formulas is infinite.

However, Pelegrí-Llopart figured out a scheme to generate optimal code given a fixed number of algebraic rewrite rules, called "BURS" (bottom-up rewrite system). This has been implemented in some code generators.

In essence, he constructs, offline, a big automaton whose states track the set of rewrites that could have been applied. Each state knows what code to generate when it is reached, so the online cost of code generation is cheap.
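A toy illustration of the bottom-up labeling idea (this is not Pelegrí-Llopart's actual construction, just a sketch of how each node can be assigned the cheapest matching rule; the operation costs here are made-up assumptions):

```python
# Bottom-up cost labeling: each node picks the cheapest applicable rule.
# Assumed costs: add = 1, shift = 1, multiply = 3 (hypothetical machine).
def best_cost(expr):
    if not isinstance(expr, tuple):
        return (0, "load")  # leaf: variable or constant
    op, left, right = expr
    lc, _ = best_cost(left)
    rc, _ = best_cost(right)
    candidates = []
    if op == "+":
        candidates.append((lc + rc + 1, "add"))
    if op == "*":
        candidates.append((lc + rc + 3, "mul"))
        if right == 2:  # rewrite rule: x * 2 is a shift
            candidates.append((lc + 1, "shift"))
        if right == 3:  # rewrite rule: x * 3 is x + (x << 1)
            candidates.append((lc + 2, "shift+add"))
    return min(candidates)

print(best_cost(("*", "x", 3)))  # shift+add (cost 2) beats mul (cost 3)
```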

Ira Baxter
  • This is what I was looking for! I guess it's unsurprising that the answer came from compiler optimization tricks. – Hooked Dec 22 '10 at 18:18
5

Ignoring powers of variables and integer coefficients, this reduces to a Karnaugh Map problem.

K-Maps can be represented in sum-of-products form and product-of-sums form, each of which represents a binary circuit. A "fewest operations" form would represent an optimized binary circuit, right?
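The correspondence being relied on here can be checked exhaustively: over 0/1 values, with + read as OR and * as AND, the example expression behaves like a boolean circuit (a small sanity-check sketch):

```python
# Verify that (a+b)*(c+d) over 0/1 inputs, clamped back to a truth
# value, matches the circuit (a OR b) AND (c OR d) on all assignments.
from itertools import product

def as_bool(x):
    return 1 if x else 0

for a, b, c, d in product((0, 1), repeat=4):
    arith = as_bool((a + b) * (c + d))      # sums may reach 2; clamp
    circuit = as_bool((a or b) and (c or d))
    assert arith == circuit
print("equivalent on all 16 assignments")
```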

Daniel
  • I don't see the connection. Could you add some details? – Draco Ater Dec 22 '10 at 10:26
  • My point is that since circuit minimization problems are believed to be intractable [Wikipedia] that no procedural program will find an *optimal* solution in polynomial time... The binary expression (a+b)*(c+d) evaluates to TRUE if (either a OR b is TRUE) AND (either c OR d is TRUE). In this sense arithmetic which only involves + (OR) and * (AND) where each variable is either 0 or 1 represents a binary circuit. – Daniel Dec 22 '10 at 11:31
  • 2
    But, you can precompute optimal for specific sets offline, and apply that answer online. See my answer. – Ira Baxter Dec 22 '10 at 16:29
  • Wow, I'm surprised that such optimizations have already been implemented given the computation power and memory space required for this approach. – Daniel Feb 04 '11 at 07:30
  • 1
    Here **intractable** is a technical term meaning that if you come up with a solution that scales better than exponentially, all mathematicians on the planet will be more surprised than if God looked down from the sky and spoke to them. (Or, as mathematicians like to put it, while grinning, "it's still an open question whether this is possible or not".) But small instances can be solved simply, no problem. – Evgeni Sergeev Feb 17 '17 at 06:45
4

There is Horner's rule for the efficient evaluation of polynomials in monomial form. According to it, given a polynomial of degree n

p(x) = a_n*x^n + a_(n-1)*x^(n-1) + ... + a_1*x + a_0

only n multiplications are needed (not n+(n-1)+(n-2)+...+1 = n(n+1)/2, as it may seem at first glance). That is because the polynomial can be rewritten as

p(x) = (...((a_n*x + a_(n-1))*x + a_(n-2))*x + ... + a_1)*x + a_0
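The rewritten form is just a left-to-right loop (a minimal sketch, with coefficients given highest degree first):

```python
# Horner's rule: evaluate p(x) with n multiplications and n additions,
# given coeffs = [a_n, ..., a_1, a_0] (highest degree first).
def horner(coeffs, x):
    result = 0
    for a in coeffs:
        result = result * x + a
    return result

# p(x) = 2x^3 - x + 5 at x = 3: 2*27 - 3 + 5 = 56
print(horner([2, 0, -1, 5], 3))  # -> 56
```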

Draco Ater
1

One idea: consider the variables as boolean values and write the expression in maxterm form.

HamoriZ
0

Not sure about the general case, but it does seem that factoring polynomials improves performance. An example from a distant comp sci course:

a * x^2 + b * x + c

is improved by factoring out x:

x * (a * x + b) + c
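Tallying the operations (my own count, writing x^2 as x*x): the direct form uses 3 multiplications and 2 additions, while the factored form uses 2 multiplications and 2 additions, and the two agree:

```python
# Compare the direct and factored evaluations of the quadratic.
def direct(a, b, c, x):
    return a * x * x + b * x + c   # 3 muls, 2 adds

def factored(a, b, c, x):
    return x * (a * x + b) + c     # 2 muls, 2 adds

assert direct(2, 3, 4, 5) == factored(2, 3, 4, 5)
print(direct(2, 3, 4, 5))  # -> 69
```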
Chris
  • Clearly factoring out a polynomial is a good heuristic, however I am interested in the _best possible_ case, making this a well-defined (if hard?) minimization problem. – Hooked Dec 07 '10 at 21:37
  • Yes, I see. Your example and mine both give the best possible case by factoring to death. Another example: abc+bcd+ade; factors three ways: bc(a+d)+ade, a(bc+de)+bcd, d(bc+ae)+abc. Clearly more is needed. Assuming this is the right direction, I would brute force the optimization. Perhaps another will propose a more elegant solution. – Chris Dec 07 '10 at 22:01