11

I have a short program here:

Given any n:
i = 0;
while (i < n) {
    k = 2;
    while (k < n) {
        sum += a[j] * b[k];
        k = k * k;
    }
    i++;
}

The asymptotic running time of this is O(n log log n). Why is this the case? I get that the whole program will run at least n times because of the outer loop, but I'm not sure where the log log n comes from. The inner loop depends on k * k, so it's obviously going to run fewer than n times, and it would just be n log n if k were doubled each time. But how do you figure out that the answer is log log n?

Bill the Lizard
chrismanderson
  • For a more detailed explanation of WHY, here is a question worth looking at: http://stackoverflow.com/questions/487258/plain-english-explanation-of-big-o – Simon Mar 05 '11 at 02:17

4 Answers

9

For a mathematical proof, the inner loop can be written as the recurrence:

T(n) = T(sqrt(n)) + 1

W.l.o.g. assume 2^(2^(t-1)) <= n <= 2^(2^t). We know 2^(2^t) = 2^(2^(t-1)) * 2^(2^(t-1)), so

T(2^(2^t)) = T(2^(2^(t-1))) + 1 = T(2^(2^(t-2))) + 2 = ... = T(2^(2^0)) + t

and therefore

T(2^(2^(t-1))) <= T(n) <= T(2^(2^t)) = T(2^(2^0)) + log log (2^(2^t)) = O(1) + log log n

==> O(1) + log log n - 1 <= T(n) <= O(1) + log log n  ==>  T(n) = Theta(log log n).

The total time is then O(n log log n).

Why is the inner loop T(n) = T(sqrt(n)) + 1? Look at when the inner loop breaks: it stops once k > n, which means that one iteration earlier k was at least sqrt(n), and two iterations earlier it was at most sqrt(n), so the running time satisfies T(sqrt(n)) + 2 >= T(n) >= T(sqrt(n)) + 1.
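
As a quick sanity check (not part of the proof), something along these lines counts the inner-loop iterations and compares the count against inner(sqrt(n)) + 1 and log2 log2 n; the function name and the sample values of n are just for illustration (compile with -lm on most systems):

#include <stdio.h>
#include <math.h>

/* Number of times the inner-loop body runs for a given n,
   when k starts at 2 and is squared on every iteration.
   Doubles are used so that large n do not overflow an integer type. */
static int inner(double n) {
    int count = 0;
    for (double k = 2; k < n; k = k * k)
        count++;
    return count;
}

int main(void) {
    double ns[] = {1e3, 1e6, 1e12, 1e24};
    for (int i = 0; i < 4; i++) {
        double n = ns[i];
        printf("n = %.0e: inner(n) = %d, inner(sqrt(n)) + 1 = %d, log2(log2 n) ~ %.2f\n",
               n, inner(n), inner(sqrt(n)) + 1, log2(log2(n)));
    }
    return 0;
}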

Saeed Amiri
  • @Tymek, I updated the answer, hope this helps, but be careful: the inner loop is not sqrt(n) + 1. I proved that the inner loop is `log log n`; what I said is that for the inner loop we have `T(n) = T(sqrt(n)) + 1`. – Saeed Amiri Jun 21 '12 at 18:45
0

The time complexity of a loop is O(log log n) if the loop variable is increased or decreased exponentially on every iteration (e.g. squared, as here). If the loop variable is multiplied or divided by a constant factor, the complexity is O(log n).

E.g. in your case the value of k evolves as follows, where the number in parentheses is how many times the loop has already executed:

2 (0), 2^2 (1), 2^4 (2), 2^8 (3), 2^16 (4), 2^32 (5), 2^64 (6), ... until n is reached.

The number of times the loop executes is therefore O(log log n).

For the sake of illustration, assume n is 2^64. Then log(2^64) = 64 and log 64 = log(2^6) = 6, so the inner loop runs 6 times when n is 2^64.
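
To make that concrete without needing an integer type that can hold 2^64, here is a tiny sketch that tracks only the exponent e (k = 2^e, so squaring k just doubles e); the variable names are only illustrative:

#include <stdio.h>

int main(void) {
    /* k = 2^e; squaring k is the same as doubling e.
       For n = 2^64 the loop condition k < n becomes e < 64. */
    int e = 1;            /* k starts at 2 = 2^1 */
    int iterations = 0;
    while (e < 64) {      /* i.e. k < n with n = 2^64 */
        printf("iteration %d: k = 2^%d\n", iterations, e);
        e *= 2;           /* k = k * k */
        iterations++;
    }
    printf("total iterations: %d (= log2(64) = log2(log2(n)))\n", iterations);
    return 0;
}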

rombi
0

I think if the code were like this, it would be n log n:

i = 0;
while (i < n) {
    k = 2;
    while (k < n) {
        sum += a[j] * b[k];
        k *= c;   // c is a constant greater than 1
    }
    i++;
}
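
For comparison, here is a rough sketch (the helper names and the sample n are just for illustration) that counts the inner-loop iterations under the two different update rules:

#include <stdio.h>

/* Iterations of the inner loop when k is multiplied by a constant c. */
static int count_multiply(long long n, long long c) {
    int count = 0;
    for (long long k = 2; k < n; k *= c)
        count++;
    return count;
}

/* Iterations of the inner loop when k is squared each time. */
static int count_square(long long n) {
    int count = 0;
    for (long long k = 2; k < n; k = k * k)
        count++;
    return count;
}

int main(void) {
    long long n = 1LL << 32;
    printf("k *= 2  : %d iterations (~log2 n)\n", count_multiply(n, 2));
    printf("k = k*k : %d iterations (~log2 log2 n)\n", count_square(n));
    return 0;
}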
Yin Jin
0

Okay, so let's break this down first:

Given any n:

i = 0;
while (i < n) {
    k = 2;
    while (k < n) {
        sum += a[j] * b[k];
        k = k * k;
    }
    i++;
}
  1. The outer loop, while (i < n), runs n times (its condition is checked n+1 times, but we round that off to n).

  2. Now comes the fun part: the inner loop, while (k < n), does not run n times. It runs about log log n times, because instead of incrementing k by 1 we square it on every pass, so it takes only log log n steps for k to reach n. You'll see this pattern again when you study design and analysis of algorithms.

  3. Combining the two, the total running time is n * log log n. I hope that makes it clear now.
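
To see the two factors multiply out, here is a small sketch (n = 2^20 is just a sample size, and the innermost statement is replaced by a counter) comparing the measured work with n * log2(log2 n); compile with -lm on most systems:

#include <stdio.h>
#include <math.h>

int main(void) {
    long long n = 1LL << 20;   /* sample input size */
    long long total = 0;       /* how many times the innermost statement runs */

    for (long long i = 0; i < n; i++)             /* outer loop: n iterations */
        for (long long k = 2; k < n; k = k * k)   /* inner loop: ~log log n iterations */
            total++;

    printf("measured inner operations: %lld\n", total);
    printf("n * log2(log2(n))        : %.0f\n", (double)n * log2(log2((double)n)));
    return 0;
}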

JON
iZERWAS