
To start off, I found this Stack Overflow question that states the time complexity is O(n^2), but it doesn't explain why O(n^2) is the time complexity; instead it asks for an example of such an algorithm. From my understanding, an algorithm that runs 1+2+3+...+n times would be less than O(n^2). For example, take this function:

function f(n: number): number {
    let sum = 0;

    // The inner loop runs i + 1 times on iteration i,
    // so sum ends up being 1 + 2 + 3 + ... + n.
    for (let i = 0; i < n; i++) {
        for (let j = 0; j < i + 1; j++) {
            sum += 1;
        }
    }

    return sum;
}

Here are some input and return values:

n    sum
1      1
2      3
3      6
4     10
5     15
6     21
7     28
From this table you can see that this algorithm runs in less than O(n^2) time but more than O(n). I also realize that an algorithm that runs 1+(1+2)+(1+2+3)+...+(1+2+3+...+n) times is truly O(n^2) time complexity. For the algorithm stated in the problem, do we just say it runs in O(n^2) because it runs more than O(log n) times?
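For comparison, here is a quick sketch (the helper name is mine, not from the question) showing why the triangular sum still counts as quadratic: the ratio of the sum to n^2 settles near 1/2 instead of shrinking toward 0.

```typescript
// Triangular sum 1 + 2 + ... + n: the same count of inner-loop runs
// as the question's nested loops produce.
function triangular(n: number): number {
    let sum = 0;
    for (let i = 1; i <= n; i++) {
        sum += i;
    }
    return sum;
}

// The ratio triangular(n) / n^2 approaches 1/2 (a nonzero constant),
// which is exactly what Theta(n^2) means up to a constant factor.
for (const n of [10, 100, 1000]) {
    console.log(n, triangular(n) / (n * n));
}
```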

Mason Clark
  • Where do you get O(log n) from? That is even less than O(n)... See [Wikipedia on 1+2+3+4+](https://en.wikipedia.org/wiki/1_%2B_2_%2B_3_%2B_4_%2B_%E2%8B%AF) – trincot Jul 01 '22 at 06:01
  • @trincot The question isn't about how to calculate 1+2+3+4..., that's simple mathematics. The question is about the time complexity of an algorithm that runs n(n-1)/2 times. But I agree about the O(log n), that was a mistake on my part, it was more a comparison and doesn't have a lot of weight with the original question. – Mason Clark Jul 01 '22 at 06:07
  • The thing is that we use the same mathematics when it comes to asymptotic complexity. Maybe that was your question? – trincot Jul 01 '22 at 06:47
  • 1 + 2 + 3 + ... + n = n*(n+1)/2 = (n²+n)/2. O((n²+n)/2) = O(n²+n) = O(n²) + O(n) = O(n²) – MrSmith42 Jul 01 '22 at 07:30
  • For the visually inclined, instead of a square, it's a triangle. It's still `n^2` because it's a shape in 2-dimensions _vs_ `n` at infinity. – Neil Jul 01 '22 at 18:49

2 Answers


It's known that 1 + 2 + ... + n has the closed form n * (n + 1) / 2. Even if you didn't know that, consider that the inner loop runs at most n times on any iteration. So the outer loop runs exactly n times (over i), and each of those iterations runs the inner loop at most n times (over j), which makes the O(n^2) bound apparent.
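To see this concretely, here is a small sketch (the function names are my own) checking that the question's nested-loop count matches the closed form exactly:

```typescript
// Count the inner-loop iterations exactly as the question's function does.
function loopCount(n: number): number {
    let sum = 0;
    for (let i = 0; i < n; i++) {
        for (let j = 0; j < i + 1; j++) {
            sum += 1;
        }
    }
    return sum;
}

// Closed form for 1 + 2 + ... + n.
function closedForm(n: number): number {
    return (n * (n + 1)) / 2;
}
```

The constant factor 1/2 in n(n+1)/2 is discarded by big-O, leaving O(n^2).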

I agree that the complexity would be exactly n^2 if the inner loop also ran from 0 to n, so you have reason to think that an outer loop i from 0 to n paired with an inner loop j from 0 to i has to perform better, and that's true. But with big-O notation you're measuring the growth rate of the algorithm's running time, not the exact number of operations.

p.s. O(log n) is usually achieved when you split the main problem into sub-problems.
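As an illustration of that p.s. (my own sketch, not part of the original answer), binary search is the classic case: each step discards half of the remaining range, so it takes O(log n) steps.

```typescript
// Binary search over a sorted array: each iteration halves the search
// range [lo, hi], so the loop runs O(log n) times.
function binarySearch(sorted: number[], target: number): number {
    let lo = 0;
    let hi = sorted.length - 1;
    while (lo <= hi) {
        const mid = (lo + hi) >> 1;
        if (sorted[mid] === target) return mid;
        if (sorted[mid] < target) lo = mid + 1;
        else hi = mid - 1;
    }
    return -1; // target not present
}
```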

Silviu Burcea
  • In 1 + 1 + 1 + ... + 1 + n, the inner loop runs at most n times. Even if there are n copies of "1 +", this is still only 2n = O(n). – Alex Reinking Jul 01 '22 at 06:09
  • Thank you for your answer. I think it clears it up a little bit. I think the problem stemmed from my misunderstanding of what Big O notation is actually about. I can see how you get to O(n^2) from the inner loop running "at most" n times. – Mason Clark Jul 01 '22 at 06:11
  • @MasonClark - my comment is meant to show how that exact line of reasoning is wrong even when one iteration actually does run n times. – Alex Reinking Jul 01 '22 at 06:12
  • @AlexReinking Can the time complexity be O(n) when the algorithm always runs more than n times? – Mason Clark Jul 01 '22 at 06:16
  • @MasonClark - Yes. If it always runs 100*n times, it is still O(n) – Alex Reinking Jul 01 '22 at 06:16
  • @AlexReinking But it doesn't run in O(c*n) because c is not a constant, c grows as n grows – Mason Clark Jul 01 '22 at 06:23
  • @AlexReinking it is not `2n`. You have one iteration for inner loop when `i` is 0, you have TWO iterations for inner loop when `i` is 1, you have `n` iterations when `i` is `n - 1`. There are at most `n` copies of `1 + ...`, as you said, but not twice, there are `n` such sequences. – Silviu Burcea Jul 01 '22 at 06:43
  • I downvoted because the statement is too loose and it has caused confusion. The claim is that a loop which runs `n` times and which executes a statement fewer than `n` times in each iteration is `O(n^2)`. This is _technically correct_, but it is very, very loose. The example I gave of a hypothetical loop that executes a statement once on every iteration until the last is `O(n)`, which is asymptotically better than `O(n^2)`. – Alex Reinking Jul 01 '22 at 06:52
  • I have even explained that OP has reasons to believe that the algorithm presented is a bit faster than having 2 loops each running from 1 to n, but at the end of the day, it is still `O(n^2)`. It's true that the inner loop has fewer steps than `n`, but, on average, it still runs `(n + 1) / 2` times, so inner loop has `O(n)` complexity. – Silviu Burcea Jul 01 '22 at 06:59

I think you should interpret the table differently. O(N^2) complexity says that if you double the input N, the runtime should roughly quadruple (take 4 times as long). In this case, the function returns a number that mirrors its runtime; I use f(N) as shorthand for it.

So say N goes from 1 to 2, which means the input has doubled (2/1 = 2). The runtime then has gone from f(1) to f(2), an increase of f(2)/f(1) = 3/1 = 3 times. That is not 4 times, but the big-O complexity measure is asymptotic, dealing with the behavior as N approaches infinity. If we test another input doubling from the table, we get f(6)/f(3) = 21/6 = 3.5, which is already closer to 4.

Let us now stray outside the table and try more doublings with bigger N. For example, f(200)/f(100) = 20100/5050 ≈ 3.980 and f(5000)/f(2500) = 12502500/3126250 ≈ 3.999. The trend is clear: as N approaches infinity, a doubled input tends toward a quadrupled runtime, and that is the hallmark of O(N^2).
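Those ratios can be reproduced with a short sketch (the function body mirrors the one in the question; the doubling harness is mine):

```typescript
// Same nested loops as in the question: f(n) = 1 + 2 + ... + n.
function f(n: number): number {
    let sum = 0;
    for (let i = 0; i < n; i++) {
        for (let j = 0; j < i + 1; j++) {
            sum += 1;
        }
    }
    return sum;
}

// Empirical doubling check: for a Theta(n^2) routine, f(2n)/f(n)
// should approach 4 as n grows.
for (const n of [100, 1000, 10000]) {
    console.log(n, f(2 * n) / f(n));
}
```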

  • I upvoted you, I liked your approach to be honest :) It's worth noting that the ratio indeed approaches `4`. The sum of the first `n` numbers is `n * (n + 1) / 2`, which is `(n^2 + n) / 2`. The sum of the first `2n` numbers is `2n * (2n + 1) / 2`, which is `(4n^2 + 2n) / 2`. We take the limit as `n` goes to infinity of `(4n^2 + 2n) / (n^2 + n)`. Applying L'Hôpital's rule, the limit is indeed `4`. – Silviu Burcea Jul 02 '22 at 09:51
  • @Silviu Burcea. Thanks. I appreciate it. The approach I suggest uses benchmarking rather than math to get a grip on the Big-O of some code. Of course, both should be in the toolbox but benchmarking is often good enough for everyday programming. Benchmarking complexity has a black-box feel to it. You study how the code behaves rather than how it looks. –  Jul 02 '22 at 17:45