If a function f(n) grows more slowly than a function g(n), why is f(n) = O(g(n))?

For example, suppose f(n) = 4n^4 and g(n) = log((4n)^(n^4)).
My book says f(n) = O(g(n)) because g(n) = n^4 * log(4n) = n^4(log n + log 4) = O(n^4 * log n). I understand why g(n) = O(n^4 * log n), but I'm not sure how they reached the conclusion that f(n) = O(g(n)) from the big-O bound on g.
I understand that f(n) grows more slowly than g(n) just by thinking about the graphs, but I'm having trouble understanding asymptotic behavior and why f(n) = O(g(n)) in general.
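One thing I tried to convince myself numerically (a quick Python sketch; I'm assuming natural log here, since the base of the logarithm only changes g by a constant factor and doesn't affect big-O) is to watch the ratio f(n)/g(n) as n grows:

```python
import math

def f(n):
    return 4 * n**4

def g(n):
    # g(n) = log((4n)^(n^4)) = n^4 * log(4n)
    return n**4 * math.log(4 * n)

# The ratio f(n)/g(n) simplifies to 4 / log(4n),
# which shrinks toward 0 as n grows.
for n in [10, 100, 1000, 10**6]:
    print(n, f(n) / g(n))
```

The ratio tending to 0 suggests f grows strictly more slowly than g, which (if I understand the definitions) is an even stronger statement than f(n) = O(g(n)).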