
I've been studying Big-O for an upcoming interview and I wanted something explained.

So one way I've been determining the run-time of algorithms is by counting the number of for-loops. For example, if an algorithm uses one for-loop I'd count that as O(n), and if it uses two for-loops I'd say it's O(n^2). I find this is generally an easy way to figure out the time complexity of simple algorithms. However, a textbook I'm reading says this algorithm has an O(n^2) run-time:

a=5
b=6
c=10
for i in range(n):
   for j in range(n):
      x = i * i
      y = j * j
      z = i * j
for k in range(n):
   w = a*k + 45
   v = b*b
d = 33

How would this have an O(n^2) run-time? If f(n) = 3 + 3n^2 + 2n + 1 = 3n^2 + 2n + 4, then are we just discarding the last loop (2n) because it does not add to 3n^2?
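One way I tried to sanity-check that count is to instrument the algorithm with a counter (the `count_ops` helper below is just my own sketch, tallying one unit per assignment):

```python
def count_ops(n):
    """Count the assignments executed by the algorithm for a given n."""
    ops = 0
    ops += 3              # a, b, c
    for i in range(n):
        for j in range(n):
            ops += 3      # x, y, z run n*n times
    for k in range(n):
        ops += 2          # w, v run n times
    ops += 1              # d
    return ops

for n in (10, 100):
    print(n, count_ops(n), 3 * n**2 + 2 * n + 4)  # the two counts agree
```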

Thanks for reading!

  • `O(n^2)` is correct as it "covers" the case of `O(n^2 + n)`. See https://softwareengineering.stackexchange.com/q/279609/162414 – user2864740 Jan 17 '20 at 20:01
  • *"determining the run-time of algorithms is by counting the number of for-loops"* This is not a correct way to determine the time complexity of an algorithm. You need to count how many times the loops iterate: when you have nested loops, multiply, but when they are not nested, add. Also, not every loop iterates O(*n*) times; you could have nested loops five levels deep, but if they're all over constant-size ranges, then multiplying five constants still gives you a constant. – kaya3 Jan 17 '20 at 20:06
  • Does this answer your question? [Big O, how do you calculate/approximate it?](https://stackoverflow.com/questions/3255/big-o-how-do-you-calculate-approximate-it) – Prune Jan 17 '20 at 20:24

1 Answer


Let's look at what the curves look like:

[Plot comparing the curves N, N^2, and N^2 + N]

You can see here what a curve for just N looks like, along with the curve for N^2 and their sum. Notice anything? The N^2 + N curve looks a lot more like an N^2 curve than an N curve; adding the N didn't really change it a whole lot. In fact, if we scale the sum curve by a constant (0.93 in this case), we can make the two look almost identical:

[Plot of N^2 overlaid with the scaled N^2 + N curve; the two are nearly indistinguishable]

If we're analyzing algorithms, at least theoretically, we're generally worried about the factors that will dominate our processing time. If adding a factor of N to the runtime doesn't change much, why bother thinking about it? We should worry about the quadratic behavior. This isn't necessarily true in real life, where we might really care if something is a constant 2x factor, but at least for theoretical analysis it is.

This is why big-O notation is an asymptotic measure of performance. We only consider what happens as N gets very large. If we think about the ratio N/N^2 as N -> infinity, it's obvious that this will trend to zero. So, the larger N gets, the less important the linear term is, and thus we discard it.
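You can see the linear term fading out numerically. Here's a quick sketch (my own `f` just mirrors the operation count from the question, f(n) = 3n^2 + 2n + 4): dividing by n^2 shows the ratio settling toward the leading constant 3 as n grows, meaning the 2n + 4 part contributes less and less.

```python
def f(n):
    """Total operation count from the question: 3n^2 + 2n + 4."""
    return 3 * n**2 + 2 * n + 4

# As n grows, f(n)/n^2 approaches 3: the lower-order terms vanish.
for n in (10, 100, 10000):
    print(n, f(n) / n**2)
```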

gct