
The following function calculates a^b. Assume that we already have a prime_list that contains all the primes we need, sorted in ascending order. The code is written in Python.

from math import sqrt

def power(a, b):
    if b == 0:
        return 1
    if b == 2:
        return a * a  # guard: otherwise power(a, 2) recurses into itself forever

    prime_range = int(sqrt(b)) + 1
    for prime in prime_list:
        if prime > prime_range:
            break
        if b % prime == 0:
            # a^b = (a^prime)^(b/prime); floor division keeps the exponent an int
            return power(power(a, prime), b // prime)

    # no prime factor up to sqrt(b), i.e. b is prime: peel off one factor of a
    return a * power(a, b - 1)

How can I determine its time complexity? P.S. The code isn't perfect, but as you can see, the idea is to use primes to reduce the number of arithmetic operations. I am still looking for an ideal implementation, so please help if you come up with something. Thanks!
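For reference, the standard fast alternative is exponentiation by squaring, which needs only O(log b) multiplications; a minimal sketch:

```python
def fast_power(a, b):
    """Compute a**b by square-and-multiply in O(log b) multiplications."""
    result = 1
    while b > 0:
        if b & 1:        # low bit of the exponent set: fold this power of a in
            result *= a
        a *= a           # square the base
        b >>= 1          # move on to the next bit of the exponent
    return result

print(fast_power(3, 10))  # 59049
```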

  • What language are you using? You can measure the running time programmatically. – Tim Biegeleisen Aug 08 '15 at 12:25
  • Even the sqrt() of your language is implementation-dependent. One can't find the overall complexity of this function unless you provide the needed details. – Am_I_Helpful Aug 08 '15 at 12:26
  • Do you really want to measure the time complexity? Opposed to "How to determine time complexity" or "How to measure running time"? – Stef Aug 08 '15 at 12:47
  • The language that you use seems to be Python. The described algorithm has O(n) complexity. However, to measure running time you can import timeit, as you can see in http://stackoverflow.com/questions/2662140/how-to-measure-running-time-of-algorithms-in-python. – Mihai8 Aug 08 '15 at 12:50
  • @Stef Sorry English no good :-( –  Aug 08 '15 at 13:38
  • @YZhQ Looking at the comment on my answer, you'd save some time by doing `a * power(a, prime - 1)` instead of `power(a, prime)`. – Holt Aug 08 '15 at 14:03

1 Answer


The worst case is when the for loop is exhausted (b has no prime factor up to sqrt(b), i.e. b is prime). But in that case b becomes b - 1 in the next recursive call, and b - 1 is divisible by 2.

So in the worst case we divide b by a factor of 2 every approximately sqrt(b) operations, until b reaches 1.
So if we set up the recurrence

f(1) = 1 and f(n) = f(n/2) + sqrt(n)

then Wolfram Alpha gives the closed form

f(n) = (1 + sqrt(2)) (sqrt(2) sqrt(n) - 1)

which is

O(sqrt(b))
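As a sanity check on the O(sqrt(b)) estimate, the function can be instrumented with an operation counter. This is only a sketch: it assumes a hardcoded prime_list of the primes up to 97 (enough for the exponents tried below) and adds a b == 2 base case so the recursion terminates.

```python
from math import isqrt

# primes up to 97; enough for exponents b up to roughly 9600
prime_list = [2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37, 41, 43, 47,
              53, 59, 61, 67, 71, 73, 79, 83, 89, 97]

ops = 0  # counts recursive calls plus prime-loop iterations

def power(a, b):
    global ops
    ops += 1
    if b == 0:
        return 1
    if b == 2:
        return a * a  # base case so power(a, 2) does not recurse into itself
    prime_range = isqrt(b) + 1
    for prime in prime_list:
        ops += 1
        if prime > prime_range:
            break
        if b % prime == 0:
            return power(power(a, prime), b // prime)
    return a * power(a, b - 1)

for b in (97, 1024, 5040, 9973):
    ops = 0
    assert power(3, b) == 3 ** b   # correctness check against the builtin
    print(f"b = {b:5d}   ops = {ops:4d}   sqrt(b) ~ {isqrt(b)}")
```

The printed counts stay far below b itself, which is consistent with the sublinear bound derived above.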

Luka Rahne
  • If you want the actual proof for the recursion (without Wolfram Alpha), you simply have `sqrt(n) + sqrt(n/2) + sqrt(n/4) + ... + sqrt(n/2^log2(n))` (the last term is sqrt(1), since `n/2^log2(n) = 1`), which is simply `sqrt(n) * sum((1/sqrt(2))^I)` with `I = 0..log2(n)`; using the formula for the sum of a geometric series, you get the desired answer ;) – Holt Aug 08 '15 at 14:13
  • If `b` is prime, it won't be divided by 2. – Juan Lopes Aug 08 '15 at 14:17
  • @Juan Lopes if b is prime, then the for loop will be exhausted (there is a condition checking against primes < sqrt(b)), and at the end (last line of the algorithm), after sqrt(b) operations, b becomes b-1, which is divisible by 2. – Luka Rahne Aug 08 '15 at 14:19
  • @LukaRahne I think your analysis gives a pretty good approximation of the average case, but the worst case is something else: you need to find `f` such that `f(2p+1) = f(2p) + sqrt(2p+1)` and `f(2p) = f(p) + sqrt(2p)`. This happens if you have a prime `b` where `(b - 1) / 2` is also prime (and so on until you get to 1). I don't know if such numbers exist, but this could happen (`23` almost has this property, since you get `23 -> 22 -> 11 -> 10 -> 5 -> 4 -> 2 -> 1`) ;) – Holt Aug 08 '15 at 14:22
  • @LukaRahne I see, but if we have a number in the form `2^k-1` it'll have O(k) times when the loop will have to be exhausted. (e.g 31->30->15->14->7->6->3->2->1->0). In this case there will be O(log(n) * sqrt(n)) operations. – Juan Lopes Aug 08 '15 at 14:26
  • So if there is a bad case, I guess it is when b gets divided by a "large prime": f(n) = f(n/sqrt(n)) + sqrt(n). – Luka Rahne Aug 08 '15 at 14:27
  • @Juan Lopes I think that case is covered in the answer. It takes into account that n gets smaller with each loop, and it is also explained by Holt in the first comment on this answer. – Luka Rahne Aug 08 '15 at 14:43
  • @LukaRahne I think that what JuanLopes wants to say is that in your example you take the case where the number is an exact power of 2 (always divisible by 2) and assume that is the worst case, while we say it is not (if you got a strange sequence with a prime number every two steps, that would be the worst case IMO). – Holt Aug 08 '15 at 17:33
  • @JuanLopes Your example is incorrect; it should be `31 -> 30 -> 15 -> 5 -> 4 -> 2 -> 1 -> 0`, but without any certainty, I'd say like you that the complexity is `O(log(n) * sqrt(n))`. – Holt Aug 08 '15 at 17:35
  • In my case I presumed that each step yields a prime number, then a number divisible by 2, then a prime again. Another possible bad case is that the for loop is fully evaluated at every step; that condition is expressed recursively as f(n) = 2*f(n/sqrt(n)) + sqrt(n). This one is at least as good as f(n) = 2*f(n/2) + sqrt(n), which is O(sqrt(n)) again. – Luka Rahne Aug 08 '15 at 17:57
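To make the worst-case discussion above concrete, the sequence of exponents along the outer recursion path can be traced with a small script. The helper `next_exponent` is hypothetical (it just mirrors which branch power(a, b) takes), and the hardcoded prime_list only covers the small exponents traced here.

```python
from math import isqrt

prime_list = [2, 3, 5, 7, 11, 13]  # enough primes for the small exponents below

def next_exponent(b):
    """Exponent of the outer recursive call that power(a, b) would make."""
    r = isqrt(b) + 1
    for p in prime_list:
        if p > r:
            break
        if b % p == 0:
            return b // p  # branch: power(power(a, p), b // p)
    return b - 1           # no prime factor up to sqrt(b): a * power(a, b - 1)

def chain(b):
    """Full sequence of outer exponents until the recursion bottoms out."""
    seq = [b]
    while b > 1:
        b = next_exponent(b)
        seq.append(b)
    return seq

print(chain(31))  # [31, 30, 15, 5, 4, 2, 1]
```

This reproduces the 31 -> 30 -> 15 -> 5 -> ... chain from the comments: the prime steps (31 and 5 here) are the expensive ones, each paying a loop of up to ~sqrt(b) iterations before dropping to b - 1.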