
Disclaimer: Homework question. I'm looking for a hint…

Professor F. Lake tells his class that it is asymptotically faster to square an n-bit integer than to multiply two n-bit integers. Should they believe him?

I believe that multiplying two n-bit ints via shift/add is an O(n) operation, but I can't see why squaring an n-bit int would be any different. Am I missing something?

Bill the Lizard
Student
    The shift and add operations are both O(n). Since multiplying two n-bit numbers will require n shift/add operations, a multiply is O(n^2). – Gabe Sep 29 '10 at 06:18
  • see related [Fast bignum square computation](https://stackoverflow.com/q/18465326/2521214) – Spektre Sep 10 '19 at 07:04

6 Answers


Since you wanted only a hint, the answer comes from this equation: (a + b)^2 = a^2 + b^2 + 2*a*b

To avoid spoiling the puzzle, I've posted the complete solution separately :)

Nikita Rybak
  • Study this equation well. IMO, it's too big of a hint and makes this problem too easy... but maybe I'm thinking like that because I already know the answer. – Dragontamer5788 Sep 29 '10 at 20:56

Imagine that squaring is actually asymptotically faster. Then if you have a * b, you could calculate:

a = m + n
b = m - n

Then solving this equation system gives:

m = (a+b)/2
n = (a-b)/2

But then we have

a * b = (m+n)*(m-n) = m² - n²

or without intermediate variables:

a * b = ((a+b)² - (a-b)²)/4

So you can replace any multiplication by two squaring operations (plus some additions and a division by 4, which is just a bit shift, all of which can be ignored for asymptotic complexity). So the complexity of multiplication is at most twice the complexity of squaring. Of course, "twice" is a constant factor, which means both have the same asymptotic complexity.
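The reduction above can be sketched in a few lines of Python (illustrative only; `square` here is just a stand-in for whatever asymptotically fast squaring routine is assumed to exist):

```python
def square(x):
    # Stand-in for the hypothetical fast squaring routine.
    return x * x

def multiply_via_squares(a, b):
    # a * b = ((a+b)^2 - (a-b)^2) / 4.  The difference is always an exact
    # multiple of 4, so the division is just a two-bit right shift
    # (exact even for negative products, since Python's >> is arithmetic).
    return (square(a + b) - square(a - b)) >> 2
```

One multiplication costs two squarings plus O(n) additions and a shift, which is exactly the constant-factor reduction the answer describes.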

Eric
Landei

Here's a hint.

And here's my solution in SECRET CODE: Fdhnevat zrnaf lbh bayl unir gb qb bar vavgvny SG, abg gjb, fb vg'f snfgre.

mtrw

Consider the steps the computer needs to take in order to accomplish these tasks. Remember that computers work drastically differently from people.

Frank V

My thought is that to multiply two n-bit integers your algorithm needs to cater for any two n-bit integers. That's (2^n)^2 possible inputs.

A squaring algorithm only needs to handle 2^n possible inputs, although it can be modelled as a multiply algorithm with two inputs the same.

My guess is that there would be some way to optimise the generic multiply algorithm when you know that both inputs will be the same, but I'd have to think about it. That's the line I'd be investigating, anyway...

Andrew Cooper

Rewritten: This is the only improvement that I can see in squaring an n-bit number over multiplying two n-bit numbers together. It may not be asymptotically better in the O(n^2)-versus-O(n) sense commonly used in computer science. However, if we take "asymptotically" literally, meaning the complexity that is approached (including the multiplicative constants), then this fits that definition. Anyway, it's all that I can see to do, so take it or leave it.

Let's say that we have two N-bit numbers, x and y. We can multiply them together (x*y) with the shift-and-add method with A*N^2 + O(N) operations where A is a constant. The second term, the O(N) term, can be disregarded for large enough N so the number of operations is essentially A*N^2.

Now we calculate x^2. If we define a to have only the upper N/2 bits of x set in it and b to have only the lower N/2 bits of x set in it, then

x = a + b

x^2 = (a + b)^2 = a^2 + b^2 + 2*a*b

However, remember that we can multiply an N-bit number with A*N^2 operations. To compute a*a we only have to do A*(N/2)^2 = A*N^2/4 operations. The same goes for b*b and a*b. If we ignore the O(N) operations, then x^2 = (a + b)^2 is calculated in

A*N^2/4 + A*N^2/4 + A*N^2/4 = (3/4)*A*N^2

operations, which is of course better than the standard A*N^2 for multiplying two arbitrary N-bit numbers, a saving of A*N^2/4. We can improve further by applying the same split to a^2 and b^2 in turn, though at some point it stops being beneficial to keep doing this. This is not an enormous improvement, but it's all that I can find. You can decide for yourselves whether this counts.

Justin Peel
  • Though I realized that it is only an improvement if the O(n^2) method for multiplying two numbers is considered. – Justin Peel Sep 29 '10 at 06:36
  • If professor is correct, then there's an algorithm to calculate square of a number satisfying requirement above (asymptotically faster than _any_ algorithm to multiply two arbitrary numbers). Where is it? – Nikita Rybak Sep 29 '10 at 06:57
  • It actually doesn't imply that. Realize that by splitting up into x = a + b, as I've said, a^2, b^2 and a*b will all take f(n/2) operations. Since f(n) = A*n^2 + B*n operations... do the math. I can be even more obvious if no one can see it. – Justin Peel Sep 29 '10 at 14:26
  • An algorithm will have an arbitrarily large speed-up over an algorithm it is asymptotically faster than for sufficiently large input size. Are we using the same definition for "asymptotically faster" here? – Nabb Sep 29 '10 at 14:30
  • @Nabb asymptotically faster doesn't necessarily mean an enormous speed-up (O(n^2) versus O(n)). For instance, the dual-pivot quicksort that came out a while ago is still O(N log N), but its coefficient is asymptotically smaller. In other words, for large enough N, the dual-pivot quicksort is faster. The same goes here. Asymptotically faster just means that as N goes to infinity it is faster. At least, that's what I've always understood. Usually, though, when people speak of asymptotically faster, they mean something like O(N) versus O(N^2). My solution fails by that definition. – Justin Peel Sep 29 '10 at 15:26
  • The professor is clearly wrong: If you look at my answer, you can see that one multiplication can be transformed to two squaring operations, so the asymptotical complexity is the same. – Landei Sep 29 '10 at 17:40
  • @Justin _"My solution fails by that definition"_ Yes, and "that" definition is the academic definition. (And I still can't follow what exactly your equation proves, sorry.) – Nikita Rybak Sep 29 '10 at 18:14
  • @Landei @Nikita Take a read of my rewritten solution. Hopefully you at least get what I was trying to show. You may not consider it a good enough improvement, but hopefully you'll at least understand it. It doesn't refute Landei's claim either, because squaring two numbers and subtracting the results isn't faster than doing a straight multiplication. – Justin Peel Sep 29 '10 at 20:34