
I am finishing up an assignment for my class; this particular section involves giving an "analysis" of the running time of several for loops. The instructor specified that he wants the answers in Big-O notation.

I am building on the foundation that total running time is based on:

1) The cost of executing each statement
2) The frequency of execution of each statement
3) The idea of working from the "inside out", specifically starting from the innermost loops.

I know that:

total = 0;

for (int i = 0; i < n; i++) {
    total++;
}

Answer: O(N); it runs in linear time.

for (int i = 0; i < n; ++i)
    for (int j = 0; j < n; ++j)
        total++;

Answer: O(N^2); we only care about how the work grows as N grows.

I am confused about

for (int i = 0; i < n; ++i)
    for (int j = 0; j < n * n; ++j)
        ++total;

and

for (int i = 0; i < n; ++i)
    for (int j = 0; j < i; ++j)
        ++total;

And last but not least: am I right in assuming, from my textbook, that all triple-nested loops run in N^3 time?

    The first one runs in O(n^3) and the second one runs in O(n^2) – Ra1nWarden Mar 23 '16 at 21:02
  • These loops are simple enough that you can get it by experimentation. You can try different `n` values (in a loop) and deduce the order. e.g. try n=10 and n=20 and see what the ratio of the `total` is. – Peter Lawrey Mar 23 '16 at 21:12
  • more precisely, the last one is `= (n * (n-1)) / 2 = (n^2 - n) / 2 ∈ O(n^2)` – Rocki Mar 23 '16 at 21:17
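Following Peter Lawrey's suggestion above, a quick experiment (my own sketch, not from any answer here) that deduces the order empirically by doubling `n` and comparing the totals:

```java
public class GrowthCheck {
    // The loop the question is confused about: inner bound is n * n.
    static long count(int n) {
        long total = 0;
        for (int i = 0; i < n; ++i)
            for (int j = 0; j < n * n; ++j)
                ++total;
        return total;
    }

    public static void main(String[] args) {
        long t10 = count(10);  // 10 * 100 = 1000 iterations
        long t20 = count(20);  // 20 * 400 = 8000 iterations
        // Doubling n multiplied the work by 8 = 2^3, suggesting O(n^3).
        System.out.println("ratio = " + (t20 / t10));
    }
}
```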

1 Answer


You can analyze your algorithms using Sigma notation, to count/expand the number of iterations run by the inner for loop of your algorithms:

    T_a(n) = Σ_{i=0}^{n-1} Σ_{j=0}^{n²-1} 1
           = Σ_{i=0}^{n-1} n²
           = n · n² = n³        ∈ O(n³)

    T_b(n) = Σ_{i=0}^{n-1} Σ_{j=0}^{i-1} 1
           = Σ_{i=0}^{n-1} i
           = n(n-1)/2           ∈ O(n²)

Where T_a covers

for (i = 0; i < n; ++i)
    for (j = 0; j < n * n; ++j)
        ++total;

and T_b:

for (i = 0; i < n; ++i)
    for (j = 0; j < i; ++j)
        ++total;
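To see where the closed form for T_b comes from, a small check (my own sketch, not part of the original answer) that counts its iterations directly and compares against n(n-1)/2:

```java
public class TbCheck {
    // Counts the iterations of T_b: on outer iteration i the inner loop
    // runs i times, so the total is 0 + 1 + ... + (n-1).
    static long countB(int n) {
        long total = 0;
        for (int i = 0; i < n; ++i)
            for (int j = 0; j < i; ++j)
                ++total;
        return total;
    }

    public static void main(String[] args) {
        int n = 100;
        // Matches the closed form n(n-1)/2 = (n^2 - n)/2, hence O(n^2).
        System.out.println(countB(n) == (long) n * (n - 1) / 2);
    }
}
```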

Finally, a note on your question:

"And last but not least: am I right in assuming, from my textbook, that all triple-nested loops run in N^3 time?"

This is not true: it depends on how the loop variable is incremented as well as how it is bounded in each loop's header. Compare e.g. with the inner loop in T_a above (bounded by n^2, but simply incremented by 1 in each iteration) or e.g. the algorithm analyzed in this answer, or, for a slightly trickier case, the single loop analyzed in this answer.
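As a concrete illustration (my own example, not from the textbook): three nested loops where the innermost bound does not depend on n stay quadratic, not cubic:

```java
public class TripleLoop {
    // Three nested loops, but the innermost runs a constant 10 times,
    // so the total work is n * n * 10, i.e. O(n^2), not O(n^3).
    static long count(int n) {
        long total = 0;
        for (int i = 0; i < n; ++i)
            for (int j = 0; j < n; ++j)
                for (int k = 0; k < 10; ++k)
                    ++total;
        return total;
    }

    public static void main(String[] args) {
        // For n = 50: 50 * 50 * 10 = 25000 iterations, growing as n^2.
        System.out.println(count(50));
    }
}
```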

  • @SkyZ Happy to help. Sigma notation is really just a helpful tool for estimating the actual number of iterations of some given algorithm, from which the time complexity naturally follows. – dfrib Mar 23 '16 at 23:08
  • Thanks for showing me the technique; my professor didn't mention anything about sigmas. Where does the divide by 2 come from for T_b? I'm confused with the second one, actually the entire T_b – SkyZ Mar 23 '16 at 23:19
  • @SkyZ The sum rule `\sum_{i=1}^{n} i = n(n+1)/2` is [a standard summation rule](https://en.wikipedia.org/wiki/Summation). The remaining sums are simple counts of `1` (or `-1`) over a sum index; if you find them difficult to follow, it'll probably be easier to grasp them sitting down with paper and pen, trying to follow equality by equality. – dfrib Mar 23 '16 at 23:25