0

I am having a hard time understanding algorithm efficiency and how you determine that a particular statement or part of a program is lg n, O(n), or log base 2 (n).

I have two examples here.

doIt() can be expressed as O(n^2).

First example.

i=1
loop (i<n)
doIt(…)
i=i × 2 
end loop  

The cost of the above is as follows:

i=1                                ... 1
loop (i<n)                         ... lg n
doIt(…)                            ... n^2 lg n
i=i × 2                            ... lg n
end loop 
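The lg n iteration count of the loop can be checked empirically. Below is a hypothetical Java sketch (the class and method names are mine) that counts how many times the loop body would run, i.e. how many times doIt(…) would be called:

```java
public class LoopCount {
    // Counts how many times the body of `loop (i < n)` runs
    // when i starts at 1 and doubles each iteration.
    static int iterations(int n) {
        int count = 0;
        for (int i = 1; i < n; i = i * 2) {
            count++; // stands in for one call to doIt(...)
        }
        return count;
    }

    public static void main(String[] args) {
        for (int n : new int[] {8, 128, 1024}) {
            System.out.println("n = " + n + " -> " + iterations(n) + " iterations");
        }
    }
}
```

For n = 128 the body runs 7 times, and log base 2 of 128 is 7, which is where the lg n in the cost table comes from.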

Second example:

static int myMethod(int n){
    int i = 1;
    for(i = 1; i <= n; i = i * 2)
          doIt();
    return 1;
}

The cost of the above is as follows:

static int myMethod(int n){                      ... 1
    int i = 1;                                   ... 1
    for(i = 1; i <= n; i = i * 2)                ... log base 2 (n)
          doIt();                                ... log base 2 (n) * n^2
    return 1;                                    ... 1
}
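If doIt() is replaced by a stand-in that does exactly n^2 operations, the total count can be checked against the log base 2 (n) * n^2 estimate. This is a hypothetical sketch; the counter field and the stand-in body are mine:

```java
public class CostDemo {
    static long operations = 0;

    // Stand-in for doIt(): two nested loops, i.e. exactly n * n operations.
    static void doIt(int n) {
        for (int a = 0; a < n; a++)
            for (int b = 0; b < n; b++)
                operations++;
    }

    static void myMethod(int n) {
        for (int i = 1; i <= n; i = i * 2)
            doIt(n);
    }

    public static void main(String[] args) {
        myMethod(64);
        // The loop runs log2(64) + 1 = 7 times; each call does 64 * 64 = 4096
        // operations, so the total is 7 * 4096 = 28672.
        System.out.println(operations);
    }
}
```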

All this has left me wondering: how do you actually work out which cost is which? I've been asking around, trying to understand, but no one has been able to explain it to me. I really want to understand how to determine the cost. Can anyone help me with this?

John
  • possible duplicate of [Big O, how do you calculate/approximate it?](http://stackoverflow.com/questions/3255/big-o-how-do-you-calculate-approximate-it) – Daniel Mošmondor Sep 10 '13 at 14:23
  • The way to think about it is "When the number n becomes very very large, and I then add 1 to the number n, how much longer will my algorithm take". There is usually one step / term that ends up dominating - that becomes the order. – Floris Sep 10 '13 at 14:24
  • 1
    Also, how can `loop (i – Boldizsár Németh Sep 10 '13 at 14:25
  • 1
    Read http://mitpress.mit.edu/books/introduction-algorithms (also, `loop(i – blgt Sep 10 '13 at 14:25
  • Your two examples look identical, except that one tests i – Darius X. Sep 10 '13 at 14:26
  • @BoldizsárNémeth its from the answers sheet. if not what it should be? – John Sep 10 '13 at 14:29
  • Pick any two bases *a* and *b*, and you can show that log_a n and log_b n differ by only a constant factor. As a result O(log_a n) is exactly the same as O(log_b n), so the base you use doesn't matter. – chepner Sep 10 '13 at 14:40
  • If you have problem with understanding why there is a logarithm remind logarithm definition: `log(base a) b = c <=> a^c = b.` In other words if n was for example 128 how many times this for loop can go? Exactly log(2) 128 = 7 times (2^7 = 128). Thus this loop runs O(logn) times because you multiply i by two in each iteration. However in Big-Oh notation you don't care about logarithm base as @chepner mentioned above. The whole loop is O(n^2 *logn) because you do O(n^2) operation O(logn) times. – fex Sep 10 '13 at 15:00
  • Yeah, thanks, I get it. The notation confused me, because I thought that the cost of doing one cycle (evaluation of the `i – Boldizsár Németh Sep 10 '13 at 17:29

2 Answers

5

Big O notation does not measure how long the program will run. It says how fast the running time increases as the size of the problem grows.

For example, if calculating something is O(1), it could still take a very long time, but that time is independent of the size of the problem.
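The difference can be made concrete with two operation counters: one that always does the same (large) amount of work no matter what n is, and one whose work grows with n. The names here are illustrative:

```java
public class GrowthDemo {
    // O(1): always the same (large) amount of work, regardless of n.
    static long constantWork(int n) {
        long ops = 0;
        for (int k = 0; k < 1_000_000; k++) ops++;
        return ops;
    }

    // O(n): work grows with n, even though it is cheaper for small inputs.
    static long linearWork(int n) {
        long ops = 0;
        for (int k = 0; k < n; k++) ops++;
        return ops;
    }

    public static void main(String[] args) {
        System.out.println(constantWork(10));      // same for any n
        System.out.println(constantWork(10_000));  // still the same
        System.out.println(linearWork(10));        // grows with n
        System.out.println(linearWork(10_000));
    }
}
```

The O(1) method is slower for small inputs, but it is the O(n) method whose cost keeps growing as the problem gets bigger.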

Boldizsár Németh
2

Normally, you are not expected to estimate the cost of things like the loop iterator (storing one integer value and changing it N times is too minor to include in the estimate).

What really matters is that, in terms of Big-O, Big-Theta, etc., you are expected to find a functional dependence, i.e. a function of one argument (N) for which:

  • Big-O: the algorithm's total operation count grows no faster than F(N)
  • Big-Theta: the algorithm's total operation count grows at the same rate as F(N)
  • Big-Omega: the algorithm's total operation count grows no slower than F(N)

So, remember: you are not trying to find the exact number of operations; you are trying to find a functional estimate of it, i.e. a functional dependence between the amount of incoming data N and some function of N that indicates how fast the operation count grows.

So O(1), for example, indicates that the whole algorithm does not depend on N (it is constant). You can read more here.

Also, there are different kinds of estimates. You can estimate memory or execution time, for example; in general those will be different estimates.
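One way to make the idea of a functional estimate concrete is to count the dominant operations and divide by a candidate F(N): if the ratio settles toward a constant as N grows, F(N) captures the growth rate. A hypothetical sketch (all names are mine):

```java
public class EstimateDemo {
    // Counts the operations of a simple algorithm: a linear setup pass
    // followed by a dominant quadratic pass.
    static long countOps(int n) {
        long ops = 0;
        for (int a = 0; a < n; a++)
            ops++;                      // linear setup work: n operations
        for (int a = 0; a < n; a++)
            for (int b = 0; b < n; b++)
                ops++;                  // dominant quadratic work: n * n operations
        return ops;
    }

    public static void main(String[] args) {
        // The ratio ops / n^2 approaches 1 as n grows, so the algorithm
        // is Theta(n^2): the linear term stops mattering.
        for (int n : new int[] {100, 200, 400}) {
            System.out.println(n + ": " + (double) countOps(n) / ((double) n * n));
        }
    }
}
```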

Alma Do