Possible Duplicate:
Big O when adding together different routines

What does O(n) + O(log(n)) reduce to? My guess is O(n), but I cannot give rigorous reasoning.

I understand that O(n) + O(1) should reduce to O(n), since O(1) is just a constant.
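A quick numerical check (Python, just my own sketch rather than a proof) seems to support the guess: the ratio (n + log(n)) / n gets closer and closer to 1 as n grows, so the log term looks negligible.

    import math

    # Ratio of (n + log n) to n for increasingly large n:
    # it appears to approach 1, suggesting the log term is negligible.
    for n in (10, 1_000, 1_000_000):
        print(n, (n + math.log(n)) / n)
    # 10        1.2302...
    # 1000      1.0069...
    # 1000000   1.0000138...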
Well, since O(f(n)) + O(g(n)) = O(f(n) + g(n)), we are simply looking for an f(n) that bounds n + log(n) from above.
Since log(n) < n for sufficiently large n, we have n + log(n) < 2n, so we can take f(n) = 2n.
Therefore O(n) + O(log(n)) = O(n + log(n)) = O(2n) = O(n).
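A minimal numerical sanity check of that bound (Python, my own sketch; the inequality itself is the real argument):

    import math

    # For every n >= 1 we have log(n) <= n, hence n + log(n) <= 2*n,
    # which is exactly the constant-factor bound used above.
    assert all(n + math.log(n) <= 2 * n for n in range(1, 100_000))
    print("n + log(n) <= 2n holds for 1 <= n < 100000")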
In a more general sense, O(f(n)) + O(g(n)) = O(f(n)) if c*f(n) > g(n) for some constant c and all sufficiently large n (that is, if g(n) = O(f(n))). Why? Because in this case f(n) "dominates" the algorithm and dictates its time complexity.
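To illustrate the domination rule with a concrete, made-up pair (f and g below are my own example, not from the question), take f(n) = n^2 and g(n) = 10n; with c = 1 the condition c*f(n) > g(n) holds for every n > 10, so O(n^2) + O(10n) = O(n^2):

    # Hypothetical f and g, chosen only to illustrate the rule.
    f = lambda n: n ** 2
    g = lambda n: 10 * n
    c = 1

    # For all n > 10, c*f(n) > g(n), so f dominates and
    # O(f(n) + g(n)) collapses to O(f(n)) = O(n^2).
    assert all(c * f(n) > g(n) for n in range(11, 100_000))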
Big O always reduces to the highest-order term. Here is the intuitive reasoning. Suppose you have O(n + n^2). Which part plays the more important role in the running time, n or n^2? Obviously n^2: once the n^2 term is there, you will barely notice the effect of n as n increases or decreases.
For example, let n = 100, so n^2 = 10000. Then n accounts for about 0.99% and n^2 for about 99.01% of the total running time. Which one would you consider for the runtime?
As n increases, this difference becomes even clearer.
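Reproducing that arithmetic, and showing how the gap widens for larger n (plain Python, just for illustration):

    # Share of n and n^2 in the total n + n^2, as percentages.
    for n in (100, 10_000):
        total = n + n ** 2
        print(f"n={n}: n is {100 * n / total:.2f}%, "
              f"n^2 is {100 * n ** 2 / total:.2f}% of the total")
    # n=100:    n is 0.99%,  n^2 is 99.01% of the total
    # n=10000:  n is 0.01%,  n^2 is 99.99% of the total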
I think you understand now: the answer is O(n). O(log n) grows more slowly than O(n), so their sum is dominated by the larger term, which is O(n).