
I'm trying to prove that c^(2n) = o((log log n)^n) (that's little-o) for any constant c. I understand that we can prove one function grows at a smaller rate than another by taking the limit of their ratio as n approaches infinity, and I can easily pick some arbitrary value for c and show that (log log n)^n indeed grows at a faster rate. But how do I prove this to be true for any constant c?

Dillon Drobena
  • This question appears to be off-topic because it is about math, not programming. Try math.stackexchange.com. – Barmar Sep 02 '14 at 00:21
  • This question appears to be off-topic because it is about CS theory, rather than programming. It might be a better fit on cs.stackexchange.com. – Jim Lewis Sep 02 '14 at 00:24

1 Answer


You want to prove that, for any choice of c,

lim_{n → ∞} c^(2n) / (log log n)^n = 0

Notice that, for any choice of c,

lim_{n → ∞} c^(2n) / (log log n)^n

  = lim_{n → ∞} (c^2 / log log n)^n

This limit is 0. Once log log n > c^2, you're raising a value less than one to the power of n. And since log log n → ∞, the base c^2 / log log n eventually stays below 1/2, so the terms are eventually bounded above by (1/2)^n, which quickly drops to zero.
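If you want that last step spelled out formally, here is one way to write the bound (a sketch; the cutoff N is notation I'm introducing, and it exists precisely because log log n → ∞):

```latex
% Fix a constant c. Since log log n -> infinity, choose N so that
% log log n >= 2c^2 (and log log n > 0) for all n >= N. Then for n >= N:
\[
  0 \le \frac{c^{2n}}{(\log\log n)^n}
    = \left(\frac{c^2}{\log\log n}\right)^{n}
    \le \left(\frac{1}{2}\right)^{n} \longrightarrow 0 ,
\]
% so the squeeze theorem gives the limit 0, i.e. c^(2n) = o((log log n)^n).
```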

Hope this helps!
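Not a proof, but a quick numerical check can make the behavior concrete. This is a Python sketch of my own (the helper log_ratio and the sample sizes are illustrative, not from the answer); it works in log space so the huge powers never overflow:

```python
import math

def log_ratio(c, n):
    """Return log( c^(2n) / (log log n)^n ), computed in log space
    so that neither the numerator nor the denominator overflows."""
    return n * (2 * math.log(c) - math.log(math.log(math.log(n))))

# For c = 2 the threshold log log n > c^2 = 4 is crossed near
# n ~ e^(e^4) ~ 5e23; past that point log(ratio) turns sharply
# negative, i.e. the ratio itself plummets toward 0.
for exponent in (6, 12, 24, 30, 60):
    n = 10 ** exponent
    print(f"n = 1e{exponent}: log(ratio) = {log_ratio(2, n):.4g}")
```

For a larger constant such as c = 10, the crossover happens around n ≈ e^(e^100), far beyond anything you can tabulate, which is exactly why checking one fixed c numerically is no substitute for the argument above.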

templatetypedef
  • To be pedantic, I think we only need to prove that the limit is less than 1 rather than equal to 0, although proving that it's equal to 0 also proves that it's less than 1. – wookie919 Sep 02 '14 at 01:02
  • @wookie919 From what I recall, it's not sufficient to prove that the limit is less than one. For example, the limit of n / (2n) as n tends to infinity is 1/2, but n != o(2n). – templatetypedef Sep 02 '14 at 01:06
  • Ah. Good point. Thanks. – wookie919 Sep 02 '14 at 01:07