20

Is O(log(log(n))) actually just O(log(n)) when it comes to time complexity?
Do you agree that this function g() has a time complexity of O(log(log(n)))?

int f(int n) {
    if (n <= 1)
        return 0;
    return f(n/2) + 1;
}

int g(int n) {
    int m = f(f(n));
    int i;
    int x = 0;
    for (i = 0; i < m; i++) {
        x += i * i;
    }
    return m;
}
chqrlie
    `O(log log n)` is `O(log n)` but not the other way around. – Pål GD Feb 15 '16 at 11:48
  • @PålGD Do you mean they are actually the same, or that O(log n) is an upper bound for O(log(log(n)))? – SomeoneWithAQuestion Feb 15 '16 at 12:28
  • 2
    The use of _is_ in [Landau notation](https://en.wikipedia.org/wiki/Big_O_notation) refers to class membership, or in this case, class containment. The class of all functions in `O(log log n)` is a subclass of the `O(log n)`. But you can also say that `log n` is an upper bound of `log n`, yes. – Pål GD Feb 15 '16 at 12:54

4 Answers

28

The function f(n) computes the integer logarithm in base 2 of n by repeatedly dividing n by 2. It iterates log2(n) times.

Calling f on its own result, as f(f(n)) does, indeed returns log2(log2(n)), at the cost of an additional log2(log2(n)) iterations. So far the complexity is O(log(N)) + O(log(log(N))). The first term dominates the second, so the overall complexity is O(log(N)).

The final loop iterates m = log2(log2(n)) times, so the time complexity of this last phase is O(log(log(N))), negligible compared to the initial phase.

Note that since x is not used before the end of function g, computing it is not needed and the compiler may well optimize the loop away entirely.

The overall time complexity comes out as O(log(N)), which is not the same as O(log(log(N))).

chqrlie
  • So when it comes to function composition you separate the time complexity of each and add them, not multiply them? – SomeoneWithAQuestion Feb 15 '16 at 08:45
  • 1
    @SomeoneWithAQuestion: for simple cases like this, if you can compute the number of operations for a phase, this gives the time complexity. You then add the complexity of separate phases, and simplify by keeping only the most significant term. – chqrlie Feb 15 '16 at 08:51
  • 5
    I would note that while O(log N) is not equivalent to O(log log N), an O(log log N) function is also an O(log N) function (the O bound does not have to be tight). – Matthieu M. Feb 15 '16 at 12:22
7

Looks like it is log(n) + log(log n) + log(log n).

In order: the first (inner) call to f(), then the second call to f() on its result, then the for loop. The final complexity is O(log n), because lower-order terms are ignored.

Mazdak
2501
3
int f(int n) {
    if (n <= 1)
        return 0;
    return f(n/2) + 1;
}

This has a time complexity of O(log2(n)), where 2 is the base of the logarithm.

int g(int n) {
    int m = f(f(n)); // the outer call takes O(log2(log2(n))) steps
    int i, x = 0;
    for (i = 0; i < m; i++) {
        x += i * i;
    }
    // This for loop also runs m = log2(log2(n)) times: O(log2(log2(n)))
    return m;
}

Hence the overall time complexity of the given function is T(n) = t1 + t2 + t3, where t1 = O(log2(n)) is the cost of the inner call to f, and t2 and t3 are the O(log2(log2(n))) costs of the outer call and the loop.

But here O(log2(n)) dominates O(log2(log2(n))).

Hence the time complexity of the given function is O(log2(n)).

Please also read What is a plain English explanation of "Big O" notation?

Shravan40
1

The time consumed by an O(log n) algorithm depends only linearly on the number of digits of n, so it scales very easily.

Say you want to compute F(100000000), the 10^8-th Fibonacci number. For an O(log n) algorithm it is only going to take about 4x the time consumed by computing F(100), since log(10^8) = 4 · log(100).

O(log log n) terms can show up in a variety of different places, but there are typically two main routes that arrive at this runtime; see the reference link here.

msc