O(n^k) is faster than O(n^k') for all k, k' >= 0 with k' > k. Likewise, O(n^2) would be faster than O(n^2 log n).
Note that you can only ignore constants; nothing involving the input size can be ignored.
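Both points can be seen in a quick Python sketch: measured against n^2, the functions n^3 and n^2 log n pull away without bound, while a constant factor like 1000 stays put.

```python
import math

# Ratios against n^2: n^3 and n^2*log(n) pull away as n grows,
# while 1000*n^2 stays a fixed factor away. Constants can be
# ignored; factors involving n cannot.
for n in [10, 10**2, 10**4, 10**6]:
    print(f"n={n:>8}: n^3/n^2 = {n:>8}, "
          f"(n^2 log n)/n^2 = {math.log(n):7.2f}, "
          f"(1000 n^2)/n^2 = 1000")
```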
Thus, the complexity of T(n) = n^2 + n^2 log n would be the worse of the two terms, which is O(n^2 log n).
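A quick numeric check (a minimal Python sketch) shows why: the ratio T(n)/(n^2 log n) equals 1 + 1/log(n) and drifts toward 1, so the n^2 term matters less and less.

```python
import math

def T(n):
    # T(n) = n^2 + n^2*log(n); natural log here, the base only
    # changes a constant factor.
    return n**2 + n**2 * math.log(n)

# The ratio tends to 1, so n^2 log n alone captures the growth rate.
for n in [10, 10**3, 10**6, 10**9]:
    print(f"n=10^{round(math.log10(n))}: T(n)/(n^2 log n) = "
          f"{T(n) / (n**2 * math.log(n)):.4f}")
```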
Little-oh
Little-oh, in loose terms, is a strict upper bound: the function grows strictly slower than the bound, never merely matching it. Yes, it is called little, and yet it is more restrictive.
n^2 = O(n^k) for k >= 2, but n^2 = o(n^k) only for k > 2.
Practically, it is Big-Oh that takes most of the limelight.
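The distinction is easy to check numerically: for Big-Oh the ratio f(n)/g(n) only has to stay bounded, while for little-oh it has to approach 0. A small Python sketch:

```python
# n^2 = O(n^2): the ratio n^2/n^2 stays bounded (it is exactly 1),
# but n^2 is NOT o(n^2), because the ratio never approaches 0.
# n^2 = o(n^3): the ratio n^2/n^3 = 1/n does approach 0.
for n in [10, 10**3, 10**6]:
    print(f"n={n:>7}: n^2/n^2 = {n**2 / n**2:.0f}, "
          f"n^2/n^3 = {n**2 / n**3:.2e}")
```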
What about T(n) = n^2.001 + n^2 log n?
We have n^2.001 = n^2 * n^0.001 and n^2 * log(n). To settle the question, we need to figure out which would eventually be bigger, n^0.001 or log(n). It turns out that a function of the form n^k with k > 0 will eventually overtake log(n) for a sufficiently large n. That is the case here, and thus T(n) = O(n^2.001).
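The overtaking is hard to watch with k = 0.001, but a bigger exponent shows the same shape at sizes a computer can print. A minimal sketch with k = 0.1 (the choice of 0.1 is just for illustration): log10(n) wins through the middle range, and n^0.1 takes over for good by around n = 10^12.

```python
import math

# For any k > 0, n^k eventually overtakes log(n). With k = 0.1 the
# crossover is small enough to print; k = 0.001 is the same story,
# only the crossover sits astronomically far out.
k = 0.1
for n in [10, 10**3, 10**6, 10**9, 10**12]:
    print(f"n=10^{round(math.log10(n)):>2}: n^{k} = {n**k:8.2f}, "
          f"log10(n) = {math.log10(n):5.1f}")
```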
Practically though, log(n) will be larger than n^0.001 for a very long time: (10^3300)^0.001 < log(10^3300), i.e. 1995.3 < 3300, and the sufficiently large n in this case would be just around 10^3550, an astronomical number. Worth mentioning again: 10^3550. There are only about 10^82 atoms in the observable universe.
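Where exactly is the crossover? Write n = 10^x, so n^0.001 = 10^(0.001 x) and log10(n) = x; the crossover then satisfies x = 1000 * log10(x). A short fixed-point iteration (a sketch, assuming base-10 logs as in the example above) lands on x ≈ 3550:

```python
import math

# Solve x = 1000 * log10(x), where n = 10^x, n^0.001 = 10^(0.001*x)
# and log10(n) = x. Fixed-point iteration converges here because the
# map's slope, 1000/(x*ln(10)), is well below 1 near the answer.
x = 1000.0
for _ in range(50):
    x = 1000 * math.log10(x)
print(f"crossover near n = 10^{x:.0f}")  # about 10^3550
```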