
I need to calculate the time complexity of the following code:

for (i = 1; i <= n; i++)
{
  for(j = 1; j <= i; j++)
  {
   // Some code
  }
}

Is it O(n^2)?

Lii
yyy
  • Duplicate of http://stackoverflow.com/questions/362059/big-o-of-this-nested-loop – Jay Conrod Feb 09 '09 at 00:23
  • My question is not exactly a duplicate of the one you linked to, but it's a common question, so I guess it's being asked in many forms. – yyy Feb 15 '09 at 12:14
  • Related: [How to find time complexity of an algorithm](https://stackoverflow.com/q/11032015) – Bernhard Barker Jan 12 '18 at 14:42
  • Does this answer your question? [What is the Big-O of a nested loop, where number of iterations in the inner loop is determined by the current iteration of the outer loop?](https://stackoverflow.com/questions/362059/what-is-the-big-o-of-a-nested-loop-where-number-of-iterations-in-the-inner-loop) – Ashutosh Dec 12 '19 at 18:58

10 Answers


Yes, nested loops are one way to quickly get a big O notation.

Typically (but not always) one loop nested in another will cause O(n²).

Think about it: the inner loop executes i times for each value of i, and the outer loop executes n times.

Thus you see a pattern of execution like this: 1 + 2 + 3 + 4 + ... + n times.

Therefore, we can bound the number of code executions from below by saying it obviously executes more than n times, but in terms of n, how many times are we executing the code?

Well, mathematically we can say that it will execute no more than n² times, giving us a worst-case scenario and therefore our Big-Oh bound of O(n²). (For more information on how we can say this mathematically, look at the arithmetic series.)

Big-Oh doesn't always measure exactly how much work is being done, but usually gives a reliable approximation of worst case scenario.


4 yrs later Edit: Because this post seems to get a fair amount of traffic, I want to explain more fully how we bound the execution to O(n²) using the arithmetic series.

The sum works out to: 1 + 2 + 3 + 4 + ... + n = (n² + n)/2 = n²/2 + n/2. How, then, are we turning this into O(n²)? What we're (basically) saying is that n² >= n²/2 + n/2. Is this true? Let's do some simple algebra.

  • Multiply both sides by 2 to get: 2n² >= n² + n?
  • Expand 2n² to get: n² + n² >= n² + n?
  • Subtract n² from both sides to get: n² >= n?

It should be clear that n² >= n (not strictly greater than, because of the case where n=0 or 1), assuming that n is always an integer.

Actual Big O complexity is slightly different from what I just said, but this is the gist of it. In actuality, Big O complexity asks whether there is a constant we can apply to one function such that it's larger than the other for all sufficiently large input (see the Wikipedia page).

gsamaras
AndyG
  • May I ask a small question here? What if the "//some code" part is some computation with O(N) complexity – how is the result calculated? I think this is the common case where one function calls another and considers the latter as a black box having some complexity provided by specs. – TSL_ Mar 16 '16 at 15:11
  • @ShawnLe: Insightful observation. In most cases, yes, we assume that `//some code` is O(1), and therefore it does not get factored into the Big O complexity. If it were in fact O(N), then our overall complexity would become O(N^3). Think of it as multiplication (because it is): for ~N outer loop iterations, the inner loop iterates ~N times, with each iteration performing ~N work. N times N times N = N^3. – AndyG Mar 16 '16 at 17:14
  • Don't forget that heapifying a heap is two nested loops (as you build the heap), but it's O(n). – Duxa Apr 10 '19 at 03:59
  • I would say nested loops are not typically n^2, since that implies that both loops have the same length. It's more likely that it's n*m. – The Fool Sep 01 '22 at 15:55

A quick way to explain this is to visualize it.

If both i and j go from 0 to N, it's easy to see O(N^2):

O O O O O O O O
O O O O O O O O
O O O O O O O O
O O O O O O O O
O O O O O O O O
O O O O O O O O
O O O O O O O O
O O O O O O O O

In this case, it's:

O
O O
O O O
O O O O
O O O O O
O O O O O O
O O O O O O O
O O O O O O O O

This comes out to be 1/2 of N^2, which is still O(N^2)

Krys Jurgowski
  • Thanks for finally helping me understand how inner loops with a j <= i condition result in 1/2 N^2. This has been bothering me for some time. Can you explain how this is still O(N^2)? – erstaples Feb 08 '15 at 03:20
  • When analyzing algorithms, we drop constant coefficients and just describe growth in terms of n: an algorithm that takes 1/2 n^2 steps still grows quadratically. As n grows, that quadratic growth dominates, whereas for a linear-time algorithm the runtime grows with n. – Krys Jurgowski Feb 08 '15 at 03:33
  • So, what you're saying is that when calculating big Oh notation, the 1/2 in 1/2(n^2) is unimportant because coefficients are dropped? The important part of that term – again, in terms of big Oh – is that it's quadratic, not that it is a quadratic that gets halved? – erstaples Feb 08 '15 at 05:11
  • Yes, exactly. Also note how fast a quadratic runtime grows: when your data goes from 100 points to 10,000 to 10,000,000, the algorithm becomes unusable because the runtime can't scale with the input. http://www.cl.cam.ac.uk/~djg11/speedo/run1-plot.png – Krys Jurgowski Feb 08 '15 at 15:52

Indeed, it is O(n^2). See also a very similar example with the same runtime here.

Chris Bunch

Let us trace the number of times each loop executes in each iteration.

for (int i = 1; i <= n; i++){  // outer loop
    for (int j = 1; j <= i; j++){  // inner loop
        // some code
    }
}

In the first iteration of the outer loop (i = 1), the inner loop executes once.

In the second iteration of the outer loop (i = 2), the inner loop executes twice.

In the third iteration of the outer loop (i = 3), the inner loop executes thrice.

So, in the last iteration of the outer loop (i = n), the inner loop executes n times.

Therefore, the total number of times this code executes is

1 + 2 + 3 + … + n

= (n(n + 1) / 2) (Sum of Natural Numbers Formula)

= (n^2 + n) / 2

= O(n^2)

——————

Also, do take a look at these

  1. https://stackoverflow.com/a/71805214/17112163
  2. https://stackoverflow.com/a/71537431/17112163
  3. https://stackoverflow.com/a/69821878/17112163
  4. https://stackoverflow.com/a/72046825/17112163
  5. https://stackoverflow.com/a/72046933/17112163

On the 1st iteration of the outer loop (i = 1), the inner loop will iterate 1 time.
On the 2nd iteration of the outer loop (i = 2), the inner loop will iterate 2 times.
On the 3rd iteration of the outer loop (i = 3), the inner loop will iterate 3 times.
.
.
On the FINAL iteration of the outer loop (i = n), the inner loop will iterate n times.

So, the total number of times the statements in the inner loop will be executed will be equal to the sum of the integers from 1 to n, which is:

(n * (n + 1)) / 2 = (n^2 + n) / 2 = O(n^2) times
user2831683

Yes, the time complexity of this is O(n^2).

Saykat

As other correct answers have shown, the resulting complexity is O(n²). More precisely, the loop body runs about n²/2 times, which boils down to O(n²). Here is why the /2 does not matter:

It is important to understand that time complexity does not refer to the speed of an algorithm but to the rate at which the runtime changes with respect to n. The rate of change.

Let's assume we have two algorithms:

  • A with complexity O(n²)
  • B with complexity O(n²/2)

For input size (n) of 5, you could resolve time complexity like this:

  • For A - O(n²) which is 25
  • For B - O(n²/2) which is 12.5

Now, for the input of size 10, you would resolve complexity like:

  • For A - O(n²) which is 100
  • For B - O(n²/2) which is 50

In either case, doubling the input size quadrupled the time to run. That means for the purpose of time complexity, O(n²/2) is the same as O(n²). As I said, the time complexity is not a measure of how long it takes to run the algorithm but how that time changes with respect to the input size.

Emmanuel

I think the easiest way to think about it is like this:

The outer loop runs n times, and for at least n/2 of those iterations, the inner loop runs at least n/2 times. The total number of inner loop iterations is therefore at least n²/4. That's O(n²).

Similarly, the outer loop runs n times, and in every iteration, the inner loop runs at most n times. The total number of inner loop iterations, therefore, is at most n². That's also in O(n²).

Matt Timmermans

The inner loop depends on the outer loop: on each pass, the inner loop runs i times. For n = 5 this gives me:

if i = 1, the inner loop runs 1 time: 1 = 1

if i = 2, the inner loop runs 2 times: 1 + 2 = 3

if i = 3, the inner loop runs 3 times: 1 + 2 + 3 = 6

if i = 4, the inner loop runs 4 times: 1 + 2 + 3 + 4 = 10

if i = 5, the inner loop runs 5 times: 1 + 2 + 3 + 4 + 5 = 15

From the above, we can see that the total is n(n + 1)/2.

So O(n * (n + 1) / 2) = O(n^2/2 + n/2) = O(n^2)

I am not great at algorithm analysis so please feel free to correct my answer.

kta

First we'll consider loops where the number of iterations of the inner loop is independent of the value of the outer loop's index. For example:

 for (i = 0; i < N; i++) {
     for (j = 0; j < M; j++) {
         sequence of statements
     }
 }

The outer loop executes N times. Every time the outer loop executes, the inner loop executes M times. As a result, the statements in the inner loop execute a total of N * M times. Thus, the total complexity for the two loops is O(N * M); when M = N, that is O(N^2).

napendra