4

f(n)=(log(n))^log(n)

g(n)= n/log(n)

f = O(g(n))?

Dimitar
  • 4,402
  • 4
  • 31
  • 47
Soup
  • 133
  • 1
  • 2
  • 9
  • 2
    Short answer: Yes. Longer answer, run a profiler on the code. big-o is not usable for actual performance measurements, only for "what happens if N grows towards infinity" type of problems. And what does "O(x)" have to do with the problem? if `f=O(g(n))`, is `n` a constant? If not, why is it not `f(n)=O(g(n))`? Or is `f` related to `f(n)=(log(n))^log(n)`? – Lasse V. Karlsen Apr 30 '10 at 20:46
  • 4
    if this is homework use the tag `homework` – BlueRaja - Danny Pflughoeft Apr 30 '10 at 20:48
  • @see http://stackoverflow.com/questions/2307283/what-does-olog-n-mean-exactly/2307314#2307314 – BalusC Apr 30 '10 at 20:50
  • The Big O notation is not about speed but about limiting behavior. – Gumbo May 04 '10 at 11:09
  • @Gumbo I did not know that O() is equivalent to lim(), either, and always associated it with measure for performance. Is this common notation in English? – devio May 04 '10 at 11:14

5 Answers

12

Take the log of both sides:

log(f(n)) = log(log n) * log n

log(g(n)) = log(n) - log(log(n)) = log(n)(1 - log(log(n))/log(n))

Both expressions share the factor log(n); the remaining factor log(log(n)) in log(f) grows without bound, while the remaining factor (1 - log(log(n))/log(n)) in log(g) tends to 1. So log(f) dominates log(g); hence g is O(f), and f is not O(g). Since it's homework, you may need to fill in the details.

It's also fairly easy to get an idea what the answer should be just by trying it with a large number. 1024 is 2^10, so taking n = 1024 (with logarithms base 2):

f(n) = 10^10

g(n) = 1024/10.

Obviously that's not a proof, but I think we can see who's winning this race.
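A quick numeric sanity check of the race (my addition, not part of the original answer; logarithms base 2 as in the n = 1024 example):

```python
import math

# f(n) = (log2 n)^(log2 n) and g(n) = n / log2 n,
# evaluated at powers of 2 to see which one wins.
def f(n):
    return math.log2(n) ** math.log2(n)

def g(n):
    return n / math.log2(n)

for k in (10, 20, 40):
    n = 2 ** k
    print(f"n = 2^{k}: f(n) = {f(n):.3e}, g(n) = {g(n):.3e}, f/g = {f(n) / g(n):.3e}")
```

The ratio f/g explodes as n grows, matching the conclusion that g is O(f) but f is not O(g).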

Steve Jessop
  • 273,490
  • 39
  • 460
  • 699
5

f(n) grows faster than g(n) if and only if f(e^n) also grows faster than g(e^n), since exp is strictly increasing to infinity (prove it yourself).

Now f(e^n) = n^n and g(e^n) = e^n / n, and you can quote the known results.
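A numeric sketch of this substitution (my addition; natural logarithms assumed):

```python
import math

# After substituting x = e^n: f(e^n) = (log e^n)^(log e^n) = n^n,
# while g(e^n) = e^n / n. Compare the three quantities for a few n.
def f(x):
    return math.log(x) ** math.log(x)

for n in (5, 10, 15):
    x = math.exp(n)
    print(n, f(x), n ** n, x / n)  # f(x) matches n^n up to rounding
```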

kennytm
  • 510,854
  • 105
  • 1,084
  • 1,005
  • This isn't quite right. 1 - 1/x is strictly increasing, but it doesn't follow that if f grows strictly faster than g, then 1 - 1/f grows strictly faster than 1 - 1/g. They might grow at the same big-O rate (unsurprisingly, since 1 - 1/x converges to a constant). So your result is true, but not for the reasons stated. – Steve Jessop Apr 30 '10 at 21:07
  • @Steve: (1) "If and only if" (2) You should compare f(1 - 1/x), not 1 - 1/f(x). – kennytm Apr 30 '10 at 21:13
  • OK, so with f(n) = n^2, g(n) = n. 1 - 1/n^2, (1 - 1/n)^2, and 1 - 1/n are all O(1), so they don't grow (strictly) faster even though some of them are strictly greater than others (for n > 1). exp is "better" for our purposes than just strictly increasing, and proves a stronger result. – Steve Jessop Apr 30 '10 at 21:22
  • +1, then. You don't even need that exp is strictly increasing, just that it and its inverse both tend to infinity. I think. – Steve Jessop Apr 30 '10 at 21:35
0

If Limit[f[x] / g[x], x -> Infinity] = Infinity, then f[x] grows faster than g[x].

Limit[Log[x] ^ Log[x] / (x / Log[x]), x -> Infinity] = + Infinity

So, Log[x] ^ Log[x] grows faster than x / Log[x]
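A numeric illustration of the same limit (my addition; natural logarithms assumed): the ratio f(x)/g(x) should grow without bound.

```python
import math

# ratio(x) = f(x) / g(x) = log(x)^log(x) / (x / log(x));
# its divergence to infinity is exactly the limit quoted above.
def ratio(x):
    L = math.log(x)
    return L ** L / (x / L)

for k in (2, 4, 8):
    print(f"x = 1e{k}: f/g = {ratio(10.0 ** k):.3e}")
```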

yassin
  • 6,529
  • 7
  • 34
  • 39
  • If you have Mathematica, you can use Plot[Log[x]^(Log[x])/(x/Log[x]), {x, 0, 100}] to see it. – yassin Apr 30 '10 at 20:51
  • 2
    http://www.wolframalpha.com/input/?i=plot+%28log%28x%29%29%5elog%28x%29+and+x%2Flog%28x%29+from+1+to+100 – kennytm Apr 30 '10 at 20:55
0

Mathematica gives the limit of f(n) / g(n) as n tends towards infinity as infinity, which means that f grows faster. This means that g(n) belongs to O(f(n)).

You can use this for example if you don't have Mathematica.

IVlad
  • 43,099
  • 13
  • 111
  • 179
0

f is vastly bigger: by a factor of n^(log(log(n)) - 1) · log(n), since (log n)^(log n) = n^(log(log(n))).
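A small check that this closed form matches the direct ratio (my addition; natural logarithms assumed):

```python
import math

# (log n)^(log n) = e^(log(n) * log(log(n))) = n^(log(log(n))), so
# f(n) / g(n) = n^(log(log(n))) / (n / log(n)) = n^(log(log(n)) - 1) * log(n).
def check(n):
    L = math.log(n)
    direct = (L ** L) / (n / L)            # f(n) / g(n) computed directly
    formula = n ** (math.log(L) - 1) * L   # the closed form above
    return direct, formula

print(check(10.0 ** 6))  # the two values agree up to floating-point rounding
```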

unkulunkulu
  • 11,576
  • 2
  • 31
  • 49
Fakrudeen
  • 5,778
  • 7
  • 44
  • 70