
I have the following algorithm, and I know its runtime complexity is O(N^2), but I want a deeper understanding of it rather than just memorizing common runtimes.

What would be the right approach to break it down and analyze it, taking into account that the inner for loop starts at i+1?

void printUnorderedPairs(int[] array) {
    for (int i = 0; i < array.length; i++) {
        for (int j = i + 1; j < array.length; j++) {
            System.out.println(array[i] + "," + array[j]);
        }
    }
}

EDIT

Asking how to analyze this specific piece of code.

Jo Ko
  • Possible duplicate of [Big O, how do you calculate/approximate it?](http://stackoverflow.com/questions/3255/big-o-how-do-you-calculate-approximate-it) – Paul Hankin Dec 31 '16 at 14:16

2 Answers


What would be the right approach to break it down and analyze it

Take pencil and paper and put down the loops unrolled:

     i        inner loops per i
-------------------------------
     0               length - 1
     1               length - 2
    ..                       ..
     k           length - k - 1
    ..                       ..
length - 2                    1
length - 1                    0

Now, in order to obtain the total time required, let's sum up the inner loops:

 (length - 1) + (length - 2) + ... + (length - k - 1) + ... + 1 + 0

It's an arithmetic progression with length terms, running from length - 1 down to 0, so its sum is

 ((length - 1) + 0) / 2 * length = length**2 / 2 - length / 2 = O(length**2)
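
If you want to sanity-check that closed form rather than take it on faith, here is a small sketch (my own addition, not part of this answer; the class name CountCheck is just for illustration) that counts the inner iterations directly and compares the count with length * (length - 1) / 2:

public class CountCheck {
    public static void main(String[] args) {
        for (int n : new int[] {1, 2, 10, 100, 1000}) {
            long counted = 0;
            // Same loop structure as the method in the question, but counting instead of printing.
            for (int i = 0; i < n; i++) {
                for (int j = i + 1; j < n; j++) {
                    counted++;
                }
            }
            long formula = (long) n * (n - 1) / 2;  // the arithmetic-series sum above
            System.out.println(n + ": counted = " + counted + ", formula = " + formula);
        }
    }
}

Both columns should agree for every n, which is exactly length**2 / 2 - length / 2 before the constant factor and the lower-order term are dropped.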
Dmitry Bychenko
  • Thank you for the response. Just a few questions. For inner loops per i, why is it length-1, length-2... 0? Shouldn't it be i+1? And how did you derive that it's an arithmetic progression? Lastly, what does ** mean? – Jo Ko Dec 31 '16 at 20:22
  • @Jo Ko: since you *start* from `i + 1` you have `Length - i - 1` to *loop* over, and we count loops, not starting indexes. `**` is the usual sign for *power* (which came to stay from good old Fortran), e.g. `10 ** 3` (ten raised to the 3rd power) is `1000`. To see the arithmetic progression, just put down some more items: `length - 1, length - 2, length - 3, ..., length - k, ..., 5, 4, 3, 2, 1`. In case you want a *proof*, use *induction* – Dmitry Bychenko Dec 31 '16 at 20:38
  • How come it is Length - i - 1? Shouldn't it be array.length-1 then, like in length - 1, length - 2, length - 3? – Jo Ko Dec 31 '16 at 20:43
  • @Jo Ko: well, in my answer I've removed `array.` prefix and put it as `length` (there's only one array and I thought it couldn't bring any confusion). So you're right: strictly speaking it should be `array.length-1, array.length-2, ... ,array.length-k,..3, 2, 1, 0` – Dmitry Bychenko Dec 31 '16 at 20:48

Let T = the number of times the inner loop runs.

For about half of the outer iterations, those with i < array.length/2, the inner loop runs at least array.length/2 times. So we have about array.length/2 outer iterations that each do at least array.length/2 inner iterations, therefore:

T >= (N/2)*(N/2)
i.e.,
T >= N²/4

This is in O(N²). Also, though, for all array.length outer iterations, the inner loop runs at most array.length times, so:

T <= N*N, i.e.,
T <= N²

This is also in O(N²). Since we have upper and lower bounds on the time that are both in O(N²), we know that T is in O(N²).

NOTE: Technically we only need the upper bound to show that T is in O(N²), but we're usually looking for bounds as tight as we can get. T is actually in Ө(N²).

NOTE ALSO: there's nothing special about using halves above -- any constant proportion will do. These are the general rules in play:

  1. Lower bound: If you do at least Ω(N) work at least Ω(N) times, you are doing Ω(N²) work. Ω(N)*Ω(N) = Ω(N²)

  2. Upper bound: If you do at most O(N) work at most O(N) times, you are doing O(N²) work. O(N)*O(N) = O(N²)

  3. And since we have both, we can use: Ω(N²) ∩ O(N²) = Ө(N²)
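
As a quick way to see the sandwich argument in action, here is a hedged sketch (my own addition, not from this answer; BoundsCheck is a made-up name) that computes T for one sample N and checks that it lands between N²/4 and N²:

public class BoundsCheck {
    public static void main(String[] args) {
        int n = 1000;                   // sample value of N (the array length)
        long t = 0;                     // T = number of inner-loop executions
        for (int i = 0; i < n; i++) {
            for (int j = i + 1; j < n; j++) {
                t++;
            }
        }
        long lower = (long) n * n / 4;  // N²/4, from the "half the outer iterations" argument
        long upper = (long) n * n;      // N², from "at most N inner steps, N times"
        System.out.println(lower + " <= " + t + " <= " + upper
                + " holds: " + (lower <= t && t <= upper));
    }
}

For n = 1000 this should print 250000 <= 499500 <= 1000000 holds: true, i.e. the exact count 499500 = 1000*999/2 sits between the two quadratic bounds, consistent with T being in Ө(N²).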

Matt Timmermans
  • Finally note that "O(N)*O(N) = O(N²)" is technically an abuse of notation since O(...) is a set of functions. Your prof might prefer if you say something like "f(N) ∈ O(N), g(N) ∈ O(N) => f(N)*g(N) ∈ O(N²)" – Matt Timmermans Dec 31 '16 at 19:09
  • I was told that j runs for N - 1 steps the first time, N - 2 steps the second time, then N - 3 steps. How so? And how can that equal 1 + 2 + 3 + ... + N - 1? – Jo Ko Dec 31 '16 at 20:33
  • @JoKo, I didn't use a lot of = signs. It runs for >= N/2 steps N/2 times (and then less than that the other N/2 times). It also runs for <= N steps all N times. To get the complexity, you don't need to calculate the exact number of times it runs. – Matt Timmermans Jan 01 '17 at 01:51