
I have a recurrence relation which is like the following:

T(n) = 2T(n/2) + n log(n)

I am using the recursion tree method to solve this, and at the end I came up with the following expression:

n(log(n/2^0) + log(n/2^1) + log(n/2^2) + ... + log(n/2^(log n)))

On solving this expression, I got the time complexity n(log n)^2, but by using the Master Theorem I got n log(log n). Please help me find my mistake.

– Anish

3 Answers


Let's take the equation you have here:

n(log(n/2^0) + log(n/2^1) + log(n/2^2) + ... + log(n/2^(log n)))

Now, let's use properties of logarithms to rewrite log(n / 2^k) as log n - log 2^k. This gives us

n((log n - log 2^0) + (log n - log 2^1) + (log n - log 2^2) + ... + (log n - log 2^(log n)))

Another property of logarithms tells us that log 2^k = k log 2. The log base isn't specified here, so for simplicity I'm going to assume they're base-2 logs. That means that log 2^k = k, which leaves us with this expression:

n((log n - 0) + (log n - 1) + (log n - 2) + ... + (log n - log n))

= n(log n + (log n - 1) + (log n - 2) + ... + 2 + 1 + 0).

You might recognize this inner sum as Gauss's sum: 0 + 1 + 2 + 3 + ... + k = k(k+1) / 2. That means that this sum simplifies down to

n (log n)(1 + log n) / 2

= Θ(n log^2 n).

Which matches what the Master Theorem says. That's great news!
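
As a quick numeric sanity check (my own sketch, not part of the original derivation): the snippet below evaluates the recurrence directly, assuming base-2 logs, a base case T(1) = 1, and n restricted to powers of two. The closed form n (log n)(1 + log n) / 2 predicts that the ratio T(n) / (n log^2 n) drifts toward 1/2.

    import math

    def T(n):
        # T(n) = 2*T(n/2) + n*log2(n), with the assumed base case T(1) = 1
        if n == 1:
            return 1.0
        return 2 * T(n // 2) + n * math.log2(n)

    for k in (10, 15, 20):
        n = 2 ** k
        # The closed form predicts this ratio approaches 0.5 as n grows.
        print(n, T(n) / (n * math.log2(n) ** 2))

For n = 2^20 the ratio is already down to about 0.53 and still falling toward 0.5, which is consistent with Θ(n log^2 n) rather than Θ(n log log n).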

– templatetypedef


Your Master Theorem result is wrong. You will get n log^2(n) = n * log(n) * log(n) from the Master Theorem as well (the second case). See more details here (case 2a).

We know that a = b = 2, which means c_crit = log_b(a) = 1. Hence f(n) = n log(n) = Theta(n^c_crit * log^k(n)) with k = 1 > -1, so case 2a gives T(n) = Theta(n log^2(n)).
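
For reference, here is the extended Master Theorem case this relies on, written out in general (my restatement, with c_crit = log_b(a) as above). Note that the n log(log n) from the question corresponds to case 2b, which requires k = -1 and therefore does not apply here:

\text{If } f(n) = \Theta(n^{c_{crit}} \log^{k} n), \text{ then:}

\text{(2a)}\ k > -1: \quad T(n) = \Theta(n^{c_{crit}} \log^{k+1} n)

\text{(2b)}\ k = -1: \quad T(n) = \Theta(n^{c_{crit}} \log\log n)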

– OmG

The Master Theorem is not applicable to this recurrence because the cost of merging, which is n * log(n) in your recurrence, has to be a power of n.

However, if you take n * log(n) to be upper bounded by n^2, then you can apply the Master Theorem (case 3 now applies). It will yield a complexity of Theta(n^2) in that case.
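
Spelling that looser application out (my addition; this is the standard case 3 of the Master Theorem applied to f(n) = n^2):

f(n) = n^{2} = \Omega(n^{\log_b a + \varepsilon}) \text{ with } \log_b a = 1,\ \varepsilon = 1,

2 f(n/2) = n^{2}/2 \le \tfrac{1}{2} f(n) \quad \Rightarrow \quad T(n) = \Theta(f(n)) = \Theta(n^{2}).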

Further, since we have taken a looser bound on the cost of merging, this complexity is looser as well. Your complexity of n * log^2(n) is the tighter bound.

– displayName
  • There is a version of the Master Theorem that does allow for log terms in the additive part of the expression, and you can apply that here. However, depending on which version of the Master Theorem the OP is learning about, that tool might not be available here. – templatetypedef Sep 03 '20 at 22:42