8

Possible Duplicate:
The most efficient way to implement an integer based power function pow(int, int)

How can I calculate powers with better runtime?

E.g. 2^13.

I remember seeing somewhere that it has something to do with the following calculation:

2^13 = 2^8 * 2^4 * 2^1

But I can't see how calculating each component of the right side of the equation and then multiplying them would help me.

Any ideas?

Edit: I did mean any base, not just 2. How do the algorithms you've mentioned below, in particular "exponentiation by squaring", improve the runtime / complexity?

Meir
  • duplicate: http://stackoverflow.com/questions/101439/the-most-efficient-way-to-implement-an-integer-based-power-function-powint-int – Nick Dandoulakis Feb 04 '10 at 08:14
  • "Exponentation by squaring" calculates `base^exp` in `log(exp)` steps, where *log* is the logarithm with base 2. – Nick Dandoulakis Feb 04 '10 at 09:01
  • @Nick D, I know I state that in my answer, but I've realized I'm slightly wrong. It's basically correct if you're using standard integers. But once you get to using bignums it becomes basically `O(log(n)^2)` because the multiplies take more than O(1) time. – Omnifarious Feb 04 '10 at 09:21
  • @Omnifarious, I said log(exp) steps, I didn't specify the *O*. I agree with you that if we take account of the "multiplication" operation the actual runtime complexity may not be O(logn). – Nick Dandoulakis Feb 04 '10 at 09:41

6 Answers

16

There is a generalized algorithm for this, but in languages that have bit-shifting, there's a much faster way to compute powers of 2. You just put in 1 << exp (assuming your bit shift operator is << as it is in most languages that support the operation).
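
For instance, in Python (a one-line illustration; any language with a left-shift operator behaves the same way):

    print(1 << 13)   # 8192, i.e. 2**13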

I assume you're looking for the generalized algorithm and just chose an unfortunate base as an example. I will give this algorithm in Python.

def intpow(base, exp):
    # Exponentiation by squaring: O(log exp) multiplications.
    if exp == 0:
        return 1
    elif exp == 1:
        return base
    elif (exp & 1) != 0:
        # Odd exponent: peel off one factor of base, then recurse on the square.
        return base * intpow(base * base, exp // 2)
    else:
        # Even exponent: just square the base and halve the exponent.
        return intpow(base * base, exp // 2)

This lets exponents be calculated in roughly log2(exp) multiplications. It's a divide and conquer algorithm. :-) As someone else said, exponentiation by squaring.

If you plug your example into this, you can see how it works and how it relates to the equation you gave:

intpow(2, 13)
2 * intpow(4, 6)
2 * intpow(16, 3)
2 * 16 * intpow(256, 1)
2 * 16 * 256 == 2^1 * 2^4 * 2^8
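
A quick sanity check of the intpow above against Python's built-in ** operator (the test values here are just examples):

    assert intpow(2, 13) == 2 ** 13 == 8192
    assert intpow(3, 10) == 3 ** 10 == 59049
    assert intpow(5, 0) == 1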
Omnifarious
10

Use bitwise shifting. E.g., 1 << 11 returns 2^11.

jhchen
3

Powers of two are the easy ones. In binary 2^13 is a one followed by 13 zeros.

You'd use bit shifting, which is a built in operator in many languages.

Karl
3

You can use exponentiation by squaring. This is also known as "square-and-multiply" and works for bases != 2, too.
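
For reference, a minimal iterative square-and-multiply sketch in Python (my own illustration, not taken from the linked material; the name power is arbitrary):

    def power(base, exp):
        # Iterative square-and-multiply: O(log exp) multiplications.
        result = 1
        while exp > 0:
            if exp & 1:            # low bit of the exponent is set
                result *= base
            base *= base           # square the base each round
            exp >>= 1              # move to the next bit of the exponent
        return result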

SebastianK
  • It's really easy to link to Wikipedia, and I think links to Wikipedia make great supplements to answers, but a link to Wikipedia is not an answer except when the answer is really too huge to write down here. – Omnifarious Feb 04 '10 at 08:24
  • Why invent the wheel twice? Often, the only crucial thing is to get the right keyword to name a problem. – SebastianK Feb 04 '10 at 08:27
1

If you're not limiting yourself to powers of two, then:

k^(2n) = (k^n)^2
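
A minimal recursive sketch of that identity in Python (illustrative only; the odd case k^(2n+1) = k * (k^n)^2 is added here to make it complete):

    def pow_by_squaring(k, n):
        # Uses k^(2m) = (k^m)^2 and k^(2m+1) = k * (k^m)^2.
        if n == 0:
            return 1
        half = pow_by_squaring(k, n // 2)
        return half * half if n % 2 == 0 else k * half * half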

Thom Smith
1

The fastest free algorithm I know of is by Phillip S. Pang, Ph.D., and the source code can be found here. It uses table-driven decomposition, which makes it possible to build an exp() function that is 2-10 times faster than the native exp() of a Pentium(R) processor.

Suraj Chandran