    int sum = 0, n = 500;
    for (int i = 0; i < n; i++)
        for (int j = 0; j < i; j++)
            sum++;

My guess is that this is simply O(N^2), but the j < i is giving me doubts.

    int sum = 0, n = 500;
    for (int i = 0; i < n; i++)
        for (int j = 0; j < i*i; j++)
            sum++;

Seems like O(N^3).

    int sum = 0, n = 500;
    for (int i = 0; i < n; i++)
        for (int j = 0; j < i*i; j++)
            if (j % i == 0)
                for (int k = 0; k < j; k++)
                    sum++;

O(N^5)?

So for each loop, j has a different bound. If it were j < n*n, it would have been more straightforward, but this one is tricky, so please help. Thanks.

btrballin
  • Keep in mind that the O notation is just scaling behavior. Just ask yourself: if the size of the input changes, how does the number of operations scale accordingly? Then you have it. – NoDataDumpNoContribution Feb 07 '15 at 22:27
  • I've voted to close this question. There are hundreds of similar questions already, and they're all too narrow to be useful in general. The question I've flagged as a duplicate provides tools that allow people to compute big-O themselves. – Paul Hankin Feb 08 '15 at 03:28

3 Answers


In the first case, sum++ executes 0 + 1 + ... + (n-1) times. Applying the arithmetic progression formula gives n(n-1)/2, which is O(n^2).
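A quick way to convince yourself (a standalone sketch, not part of the original question) is to count the iterations and compare against the closed form:

    #include <stdio.h>

    int main(void) {
        int n = 500;
        long long sum = 0;
        for (int i = 0; i < n; i++)
            for (int j = 0; j < i; j++)
                sum++;
        /* closed form: n(n-1)/2 */
        printf("counted = %lld, formula = %lld\n",
               sum, (long long)n * (n - 1) / 2);
        return 0;
    }

Both print 124750 for n = 500.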

In the second case we'll have 0 + 1 + 4 + 9 + ... + (n-1)^2, which is the sum of the squares of the first n-1 numbers, and there's a formula for it: (n-1)n(2n-1)/6. That is O(n^3).
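The same kind of empirical check works here (again a standalone sketch): the counted iterations match (n-1)n(2n-1)/6.

    #include <stdio.h>

    int main(void) {
        int n = 500;
        long long sum = 0;
        for (int i = 0; i < n; i++)
            for (long long j = 0; j < (long long)i * i; j++)
                sum++;
        /* closed form for 0^2 + 1^2 + ... + (n-1)^2 */
        long long formula = (long long)(n - 1) * n * (2LL * n - 1) / 6;
        printf("counted = %lld, formula = %lld\n", sum, formula);
        return 0;
    }

Both print 41541750 for n = 500.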

The last one is interesting. You can see that the innermost for loop runs only when j is a multiple of i, so you can rewrite the program as follows:

    int n = 500;
    for (int i = 0; i < n; i++) {
        for (int m = 0; m < i; m++) {
            int j = m * i;
            for (int k = 0; k < j; k++)
                sum++;
        }
    }

It's easier to work with math notation:

    sum = Σ_{i=0}^{n-1} Σ_{m=0}^{i-1} m·i
        = Σ_{i=0}^{n-1} i · (i-1)i/2
        = (1/2) Σ_{i=0}^{n-1} (i^3 - i^2)
        = Θ(n^4)

The formula is derived from the code by analysis: sum++ executes j = m·i times in the innermost loop, the middle loop runs i times (once per m), and the outer loop runs n times (once per i). In the end, the problem boils down to the sum of the cubes of the first n numbers plus lower-order terms (which do not affect the asymptotics), so the running time is O(n^4).
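As a sanity check on the O(n^4) bound, here is a rough standalone sketch (not from the answer) that runs the original triple loop for a few sizes and divides the count by n^4; if the bound is tight, the ratio should settle near a constant (about 1/8):

    #include <stdio.h>

    int main(void) {
        for (int n = 50; n <= 200; n += 50) {
            long long sum = 0;
            for (int i = 0; i < n; i++)
                for (long long j = 0; j < (long long)i * i; j++)
                    if (j % i == 0)
                        for (long long k = 0; k < j; k++)
                            sum++;
            /* if sum is Theta(n^4), this ratio approaches a constant */
            printf("n = %3d  sum = %10lld  sum/n^4 = %.4f\n",
                   n, sum, (double)sum / ((double)n * n * n * n));
        }
        return 0;
    }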

One final note: it looks obvious, but I'd like to show that, in general, the sum of the dth powers of the first N natural numbers is Ω(N^(d+1)) (see Wikipedia for Big-Omega notation); that is, it grows no slower than that function. You can apply the same reasoning to prove the stronger statement that it is Θ(N^(d+1)), which combines both Ω and O.

    Σ_{k=1}^{N} k^d ≥ Σ_{k=⌈N/2⌉}^{N} k^d ≥ (N/2) · (N/2)^d = N^(d+1) / 2^(d+1) = Ω(N^(d+1))

Artem Sobolev
  • Hi. I appreciate your lengthy answer, but I couldn't quite understand what you were trying to say for the very last problem. I did not see you conclude with an answer for it, so I'm a little confused. – btrballin Feb 08 '15 at 02:00
  • @btrballin, do you mean the very last problem of yours, or my derivation of the sum of `d`th powers? The conclusion for the first is that the running time is `O(n^4)`, rather than `O(n^5)`, and the final note is a "proof" that if you have a loop `for k in [1..n]` whose body has `O(k^d)` running time, then the loop runs in `O(n^(d+1))`. The reason I included this derivation is that in the first 3 examples I used the fact that the sums of natural numbers, squares, and cubes all have closed formulas of one higher order. The last paragraph generalizes these results. – Artem Sobolev Feb 08 '15 at 08:03

You are right for all but the last one, which has a tighter bound of O(n^4): note that the last for loop is only executed when j is a multiple of i. There are x / i multiples of i less than or equal to x, and (i*i) / i = i. So the last loop is only executed for i values of j out of the i*i.
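To see the counting argument concretely, here is a small hypothetical check that, for each i, exactly i of the i*i candidate values of j pass the j % i == 0 test:

    #include <stdio.h>

    int main(void) {
        for (int i = 1; i <= 5; i++) {
            int multiples = 0;
            for (int j = 0; j < i * i; j++)
                if (j % i == 0)
                    multiples++;
            /* expect: exactly i multiples of i in [0, i*i) */
            printf("i = %d: %d multiples\n", i, multiples);
        }
        return 0;
    }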

Note that big-O gives an upper bound, so i*i vs. n*n makes little difference. Strictly speaking, saying they are all O(n^2015) is also correct (because that is a valid upper bound), but it's hardly helpful, so in practice a tight bound is usually used.

IVlad

IVlad already gave the correct answer.

I think what confuses you is the "Big Oh" definition.

  • N^2 is O(N^2)
  • (1/2)N^2 is O(N^2)
  • (1/2)N^2 + c·N + b is also O(N^2), given that c and b are constants independent of N (see the sketch below)
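A minimal sketch of the third point (c and b here are arbitrary constants picked for the example): the ratio of (1/2)N^2 + cN + b to N^2 approaches 1/2 as N grows, so the lower-order terms and constants do not change the O(N^2) classification.

    #include <stdio.h>

    int main(void) {
        double c = 3.0, b = 7.0; /* arbitrary constants independent of N */
        for (long long N = 10; N <= 1000000; N *= 10) {
            double f = 0.5 * N * N + c * N + b;
            /* ratio tends to the constant 1/2 as N grows */
            printf("N = %8lld  f(N)/N^2 = %.6f\n", N, f / ((double)N * N));
        }
        return 0;
    }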

Check the definition of Big-O here.

finxxi