
The answer to the question "What is the difference between Θ(n) and O(n)?" states that "Basically when we say an algorithm is of O(n), it's also O(n^2), O(n^1000000), O(2^n), ... but a Θ(n) algorithm is not Θ(n^2)."

I understand Big O to represent an upper bound, or the worst case. Given that, I don't understand how an O(n) algorithm is also O(n^2) and all the other cases worse than O(n).

Perhaps I have some fundamental misunderstandings. Please help me understand this as I have been struggling for a while.

Thanks.

Yadrif
  • If a number is `< 1`, it is also `< 10` and `< 1000`, but if it is `==1`, it is not also `==10` or `==1000`. – tobias_k Aug 31 '15 at 13:56
  • An upper bound needn't be tight. The same way that 0.99 < 1 but also 0.99 < 100000000000, you can say n=O(n) but also n=O(n³). BigO is an upper bound. BigΘ is more like an equality. –  Aug 31 '15 at 13:58
  • Big Oh notation is a ballpark. An algorithm is Big-Oh(n) if it's "in the ballpark of n". n^2 is a bigger ballpark, etc. – AndyG Aug 31 '15 at 13:59
  • If something can be done in no more than `n` time then it follows that it can also be done in `n^2` time because `n < n^2`. This is why O(n) algorithms are also O(n^2). – shuttle87 Aug 31 '15 at 14:00

4 Answers


It's helpful to think of what big-Oh means: if a function is O(n), then c*n, where c is some positive constant, is an upper bound. If c*n is an upper bound, it's clear that for n >= 1, c*n^2 is also an upper bound, and so are c*n^3, c*n^4, c*n^1000, etc.
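(A small numeric sketch, not part of the original answer: it uses a hypothetical f(n) = 2n + 3 with an assumed constant c = 3 and checks, on a sampled range, that once c*n bounds f(n), the looser expressions c*n^2 and c*n^1000 bound it too.)

```python
# Hypothetical example (not from the answer): f(n) = 2n + 3, with c = 3.
# For n >= 3, c*n bounds f(n); the looser bounds c*n^2 and c*n^1000 then hold as well.
def f(n):
    return 2 * n + 3

c = 3
for n in range(3, 201):
    assert f(n) <= c * n          # f(n) = O(n): c*n is an upper bound
    assert f(n) <= c * n ** 2     # therefore also O(n^2)
    assert f(n) <= c * n ** 1000  # ... and O(n^1000), and so on
print("all bounds hold on the sampled range")
```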

The graph below shows the growth of several common functions; each function is an upper bound of the functions "to the right" of it, i.e., it grows faster.

[graph: growth rates of common functions]

erip

Suppose the running time of your algorithm is T(n) = 3n + 6 (i.e., an arbitrary polynomial of order 1).

It's true that T(n) = O(n) because 3n + 6 < 4n for all n > 6 (to use the definition of big-oh notation). It's also true that T(n) = O(n^2) because 3n + 6 < n^2 for all n > 5 (to use the definition again).

It's also true that T(n) = Θ(n) because, in addition to the proof that it is O(n), it is true that 3n + 6 > n for all n > 1. However, you cannot find any constant c > 0 such that 3n + 6 > c n^2 for all sufficiently large n (proof sketch: c n^2 - 3n - 6 -> infinity as n -> infinity for any fixed c > 0), so T(n) is not Ω(n^2) and therefore not Θ(n^2).
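(The check below is added for illustration and is not part of the original answer; it just samples the inequalities above in Python, with c = 0.01 chosen arbitrarily for the last point. Sampling cannot prove an asymptotic claim, but it matches the algebra.)

```python
# Running time from the answer: T(n) = 3n + 6
def T(n):
    return 3 * n + 6

# Sample the inequalities used above:
#   T(n) <= 4n   for n >= 6   (T is O(n))
#   T(n) <= n^2  for n >= 5   (T is O(n^2))
#   T(n) >= n    for n >= 1   (T is Omega(n), hence Theta(n) together with O(n))
for n in range(6, 10001):
    assert T(n) <= 4 * n
    assert T(n) <= n ** 2
    assert T(n) >= n

# But T(n) is not Omega(n^2): for any fixed c > 0 (here c = 0.01, an arbitrary choice),
# c*n^2 eventually exceeds 3n + 6.
c = 0.01
n = 1
while T(n) >= c * n ** 2:
    n += 1
print(f"c*n^2 exceeds T(n) from around n = {n} (c = {c})")
```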

chepner

"I understand Big O to represent an upper bound, or the worst case. Given that, I don't understand how an O(n) algorithm is also O(n^2) and all the other cases worse than O(n)."

Intuitively, an "upper bound of x" means that something will always be less than or equal to x. If something is less than or equal to x, it is also less than or equal to x^2 and x^1000, for large enough values of x. So x^2 and x^1000 can also be upper bounds.

This is what Big-oh represents: upper bounds.
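(A short check added for illustration, not part of the original answer: for x >= 1 the candidate bounds are nested, so anything below x is automatically below x^2 and x^1000.)

```python
# For x >= 1 the bounds nest: x <= x**2 <= x**1000,
# so a quantity bounded by x is also bounded by x**2 and x**1000.
for x in range(1, 201):
    assert x <= x ** 2 <= x ** 1000
print("x <= x^2 <= x^1000 holds for the sampled x >= 1")
```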

IVlad

When we say that f(n) = O(g(n)), we mean only that there exists a constant c such that f(n) <= cg(n) for all sufficiently large n. Note that if f(n) = O(g(n)), we can always choose a function h(n) bigger than g(n); since g(n) is eventually less than h(n), we have f(n) <= cg(n) <= ch(n), so f(n) = O(h(n)) as well.

Note that an O bound need not be tight. The Theta bound is the intersection of O(g(n)) and Omega(g(n)), where Omega gives the lower bound (it's like O, the upper bound, but bounds from below instead). If f(n) is bounded below by g(n), and h(n) is bigger than g(n), it does not follow that f(n) is also bounded below by h(n).
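(Sketch added for illustration, not part of the original answer: it spells out the chaining f(n) <= cg(n) <= ch(n) with hypothetical witness constants. Checking finitely many n can only illustrate the definition, not prove an asymptotic bound.)

```python
# Hypothetical helper: check a big-O "witness" (c, n0) on a finite sample of n.
# This only illustrates the definition; a real proof needs algebra, not sampling.
def check_witness(f, g, c, n0, n_max=10000):
    return all(f(n) <= c * g(n) for n in range(n0, n_max))

f = lambda n: 5 * n + 7   # assumed example: f(n) = O(n) with witness c = 6, n0 = 7
g = lambda n: n
h = lambda n: n ** 3      # an h(n) that is eventually bigger than g(n)

assert check_witness(f, g, c=6, n0=7)   # f(n) <= c*g(n)
assert check_witness(g, h, c=1, n0=1)   # g(n) <= h(n) for n >= 1
assert check_witness(f, h, c=6, n0=7)   # hence f(n) <= c*h(n): f(n) = O(h(n)) too
```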

Patrick87