When we say a method has a time complexity of O(n^2), is it meant in the same way as in 10^2 = 100, or does it mean that the method is at most (or closest to) that notation? I am really confused about how to understand Big-O. I remember something called an upper bound; would that mean "at most"?
-
An upper bound can be thought of as a max. It's a way to look at the worst-case running time complexity. – Mitch Nov 10 '18 at 16:01
-
Why not look it up in your favourite [source of information](https://en.wikipedia.org/wiki/Big_O_notation)? – n. m. could be an AI Nov 10 '18 at 16:08
-
I already looked there and didn't get it. Wikipedia is actually my least favourite source, since I find that it overcomplicates things. – Nov 10 '18 at 16:10
-
Actually, the Wikipedia explanation is pretty decent if you carefully read through it and take a look at the illustrations. Let me write an answer summarizing what Wikipedia explains. – Zabuzard Nov 10 '18 at 16:39
-
@LasseVågsætherKarlsen All of the notions are *worst case* in the sense that they must fulfill the `forall` operator in the definition. So a single *bad* execution already forces the complexity class. Big-O means that the function isn't *worse* (with respect to asymptotic complexity). Big-Omega means it isn't *better*. Theta means it's in the same complexity class. – Zabuzard Nov 10 '18 at 16:54
-
@LasseVågsætherKarlsen A function being in `O(n^2)` does *not* necessarily mean that it quadruples when the input doubles. `f(n) = n` is still in `O(n^2)`. And even if we were talking Theta, `f(n) = n^2 + 1` does not (exactly) quadruple when the input doubles, but is in `Theta(n^2)`. And no, "upper bound" does not mean "worst case". Whether a function represents the best-case, worst-case or average execution time of an algorithm (or something else entirely) is completely independent of whether you use O (upper bound), Omega (lower bound) or Theta (lower and upper bound). – sepp2k Nov 10 '18 at 16:59
-
Their explanation is more than accessible to an average student. If you don't get it, you *badly* need to brush up your maths. – n. m. could be an AI Nov 10 '18 at 19:54
-
@Zabuza Big-O isn't the same as worst-case - linking those two probably leads to more confusion than anything else. [How do O and Ω relate to worst and best case?](https://cs.stackexchange.com/q/23068) – Bernhard Barker Nov 10 '18 at 20:11
-
Related / duplicate: [What is a plain English explanation of "Big O" notation?](https://stackoverflow.com/q/487258) See also: [What is the difference between Θ(n) and O(n)?](https://stackoverflow.com/q/471199) – Bernhard Barker Nov 10 '18 at 20:14
2 Answers
It means that the running time is bounded above by N².
More precisely, T(N) < C·N², where C is some constant and the inequality holds for all N beyond some threshold N*.
For example, 2N²+4N+6 = O(N²), because 2N²+4N+6 < 3N² for all N>5.
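
As a quick sanity check of that example (not part of the original answer), here is a small Python sketch that tabulates both sides of the inequality; the constant 3 and the threshold N > 5 are the ones given above:

```python
# Spot-check 2N^2 + 4N + 6 < 3N^2 over a small range (this illustrates the
# bound numerically; it does not prove the asymptotic claim).
def T(n):
    return 2 * n * n + 4 * n + 6

for n in range(1, 13):
    bound = 3 * n * n
    print(f"N={n:2d}  T(N)={T(n):4d}  3N^2={bound:4d}  holds={T(n) < bound}")
```

The bound fails at N = 5 (76 > 75) and holds from N = 6 onward, matching "for all N > 5".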
Explanation
If a method `f` is inside `O(g)`, with `g` being another function, it means that from some point on (there exists some `n_0` such that for all `n > n_0`) the function `f` will always output a smaller value than `g` at that point. However, `g` is allowed to carry an arbitrary constant factor `k`. So `f(n) <= k * g(n)` for all `n` above some `n_0`. That means `f` is allowed to be bigger at first, as long as it eventually becomes smaller and stays smaller.

We say `f` is asymptotically bounded by `g`. Asymptotically means that we do not care how `f` behaves in the beginning, only what it does when approaching infinity. So we discard all inputs below `n_0`.
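
To make the "bigger at first, then smaller forever" behaviour concrete, here is a minimal sketch with a made-up example (not from the answer): take `f(n) = n + 100`, `g(n) = n^2` and `k = 1`; `f` exceeds `k * g` for small inputs, but from `n_0 = 10` onward it stays below it.

```python
# f(n) = n + 100 is in O(n^2): it is larger than n^2 for small n,
# but smaller than 1 * n^2 for every n > 10, so n_0 = 10 works.
def f(n):
    return n + 100

def g(n):
    return n * n

k = 1
for n in range(1, 16):
    print(f"n={n:2d}  f(n)={f(n):3d}  k*g(n)={k * g(n):3d}  f<=k*g: {f(n) <= k * g(n)}")
```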
Illustration
An illustration would be this:
The blue function is `k * g` with some constant `k`, the red one is `f`. We see that `f` is greater at first but then, starting at `x_0`, it will always be smaller than `k * g`. Thus `f` is in `O(g)`.
Definition
Mathematically, this can be expressed by `f in O(g) iff there exist k > 0 and n_0 such that f(n) <= k * g(n) for all n > n_0`, which is the usual definition of Big-O. From the explanation above, the definition should be clear. It says that from a certain `n_0` on, the function `f` must be smaller than `k * g` for all inputs; `k` is allowed to be some constant.
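
One way to internalize this definition is to spot-check it numerically. The helper below is my own sketch (the name `appears_bounded` is made up); it tests `f(n) <= k * g(n)` for a concrete choice of `k` and `n_0` over a finite range, so it can only support a claim, never prove the asymptotic statement:

```python
def appears_bounded(f, g, k, n0, limit=1000):
    """Finite spot-check of the Big-O definition: f(n) <= k * g(n) for n0 < n <= limit."""
    return all(f(n) <= k * g(n) for n in range(n0 + 1, limit + 1))

# Example from above: 2n^2 + 4n + 6 is in O(n^2), witnessed by k = 3 and n_0 = 5.
print(appears_bounded(lambda n: 2 * n * n + 4 * n + 6,
                      lambda n: n * n,
                      k=3, n0=5))  # True
```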
Both images are taken from Wikipedia.
Examples
Here are a couple of examples to familiarize yourself with the definition (a small numeric spot-check of the last three follows after the list):

- `n` is in `O(n)` (trivially)
- `n` is in `O(n^2)` (trivially)
- `5n` is in `O(n^2)` (starting from `n_0 = 5`)
- `25n^2` is in `O(n^2)` (taking `k = 25` or greater)
- `2n^2 + 4n + 6` is in `O(n^2)` (take `k = 3`, starting from `n_0 = 5`)
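
Here is the spot-check promised above, applying the same finite-range idea to the last three entries with the `k` and `n_0` given in the list (for `5n` I assume the implicit `k = 1`); again, a sketch, not a proof:

```python
# (f, g, k, n0) tuples taken from the list above.
cases = [
    (lambda n: 5 * n,                 lambda n: n * n, 1,  5),  # 5n in O(n^2)
    (lambda n: 25 * n * n,            lambda n: n * n, 25, 0),  # 25n^2 in O(n^2)
    (lambda n: 2 * n * n + 4 * n + 6, lambda n: n * n, 3,  5),  # 2n^2 + 4n + 6 in O(n^2)
]

for f, g, k, n0 in cases:
    print(all(f(n) <= k * g(n) for n in range(n0 + 1, 1001)))  # True, True, True
```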
Notes
Actually, `O(g)` is a set in the mathematical sense. It contains all functions with the above-mentioned property (those that are asymptotically bounded by `g`).

So, although some authors write `f = O(g)`, it is actually wrong and should be `f in O(g)`.
There are also other, similar sets, which only differ in the direction of the bound:

- Big-O: less than or equal, `<=`
- Small-o: strictly less, `<`
- Big-Omega: greater than or equal, `>=`
- Small-omega: strictly greater, `>`
- Theta: Big-O and Big-Omega at the same time (equals)
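
As a final, hedged illustration of the two-sided (Theta) case, using my own example rather than one from the answer: `2n^2 + 4n + 6` is in `Theta(n^2)`, since it lies between `2 * n^2` and `3 * n^2` for all `n > 5`.

```python
def f(n):
    return 2 * n * n + 4 * n + 6

# Finite spot-check of a Theta bound: lower constant 2, upper constant 3, from n_0 = 5 on.
print(all(2 * n * n <= f(n) <= 3 * n * n for n in range(6, 1001)))  # True
```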
