
How can I find the time complexity of the following code?

(Sorry for adding image, I will re-edit my question once I have access to the laptop)

[Image of the code: three nested loops, as analyzed below]
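The code itself was posted only as an image, but from the analysis in the question and answers (an outer loop over n, a middle loop over i, and an inner doubling loop bounded by i*j), it presumably looks something like this sketch (the function and variable names are assumptions, not the original code):

```python
def count_ops(n):
    """Hypothetical reconstruction of the loops in the image:
    the inner loop doubles k until it reaches i*j."""
    ops = 0
    for i in range(1, n + 1):        # outer loop: n iterations
        for j in range(1, i + 1):    # middle loop: i iterations
            k = 1
            while k < i * j:         # inner loop: ~log2(i*j) iterations
                k *= 2
                ops += 1
    return ops
```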

What I have done so far:

The first loop iterates n times, the second i times, and the third log(i*j) times. So after simplifying I got:

Σ (i = 0 to n) of i·log(i) + n · (Σ (j = 0 to i) of log(i))

But why is this equal to O(n^2 log(n))?

2 Answers


Approach 1:

In the worst case:

The first loop executes n times. Time complexity of the first loop = O(n)

Time complexity of the first and second loops combined = 1 + 2 + 3 + ... + n = n*(n+1)/2 = O(n^2)

We can calculate the complexity of the third loop separately and multiply it by the previously calculated complexity, because the third loop does not modify any variable of the first or second loop.

Time complexity of the third loop = log(i*j)

In the worst case i = n and j = n, so:

log(i*j) = log(n*n) = log(n^2) = 2*log(n) = O(log(n))

Final complexity = O(n^2 * log(n))
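As a sanity check on the O(n^2 * log(n)) bound, one can count the inner-loop iterations directly (assuming the doubling-loop structure described in the question) and compare against n^2 * log2(n); the ratio should stay bounded by a constant:

```python
import math

def ops(n):
    # Counts inner-loop iterations for the assumed loop structure:
    # k doubles from 1 until it reaches i*j.
    total = 0
    for i in range(1, n + 1):
        for j in range(1, i + 1):
            k = 1
            while k < i * j:
                k *= 2
                total += 1
    return total

for n in (50, 100, 200):
    print(n, round(ops(n) / (n * n * math.log2(n)), 3))
```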

Approach 2:

If we visualize this in terms of log(i*j), it will look something like this:

log(1) + log(2) + log(3) + ... + log(n*(n-1))

=

n*log((n-1)!)

Refer to the value of log(n!) via Stirling's approximation (see Stack Overflow) for the result below:

log((n-1)!) = O(n*log(n))
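The claim log((n-1)!) = O(n log n) follows from Stirling's approximation, log(n!) ≈ n log n − n. A quick numerical check, using `math.lgamma` (which computes ln(n!) exactly as ln Γ(n+1)):

```python
import math

n = 1000
exact = math.lgamma(n + 1)        # ln(n!) via the log-gamma function
stirling = n * math.log(n) - n    # leading terms of Stirling's approximation
print(exact / stirling)           # approaches 1 for large n
```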

Final complexity:

O(n * n*log(n))

=

O(n^2 * log(n))
Kaushal Kumar
  • Why is the second-to-last line log(n)? How did you get that? The summation of logs is the log of their product, and 1*2*...*n^2 isn't n for sure! –  Jul 23 '20 at 22:08
  • You also got the correct answer by chance. If you are already summing over all `log(i*j)` terms then you shouldn't also multiply the result by how many of them there are. Additionally as OP correctly pointed out, `log(1) + log(2) + log(3) + ... + log(n*(n-1))` does *not* sum to `O(log(n))`; instead it sums to `log([n*(n-1)]!)` (factorial) which can then be expanded to `O(n^2 * log(n))` with [Stirling's approximation](https://en.wikipedia.org/wiki/Stirling%27s_approximation). – meowgoesthedog Jul 24 '20 at 11:27
  • This is the exact same approach adopted by a previous answer (which also arrived at the right answer by chance), and is incorrect because it cannot be applied generally, since the loops are not independent. Your original idea to sum each `log(i*j)` term was correct, but the math was wrong. – meowgoesthedog Jul 24 '20 at 11:38

If we look at the outermost two loops, we observe that there are 1 + 2 + 3 + ... + (n - 1) iterations. Using standard facts about series summation (or induction, if you want an in-depth proof), we can see that this is O(n^2).
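That summation identity is easy to verify numerically (a minimal check, not a proof):

```python
n = 1000
# 1 + 2 + ... + (n - 1) equals n*(n - 1)/2, the triangular-number formula
assert sum(range(1, n)) == n * (n - 1) // 2
print("OK")
```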

The innermost loop is O(log(n)). The easiest way to prove this is with the master theorem. You can write a recurrence in the form:

LoopIterations(n) = LoopIterations(n/2) + 1

Where LoopIterations is the number of iterations from a starting point n. The master theorem tells us that LoopIterations(n) is O(log(n)).

One subtlety is that our initial condition for the recurrence is LoopIterations(n^2), but since log(n^2) = 2 log(n) this doesn't affect the final computational complexity, since we can ignore constant factors.
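The recurrence can also be unrolled directly; a small sketch that counts how many halvings it takes to get from n (or n^2) down to 1:

```python
def loop_iterations(n):
    # Unrolls LoopIterations(n) = LoopIterations(n // 2) + 1, base case 1:
    # count how many times n can be halved before reaching 1.
    count = 0
    while n > 1:
        n //= 2
        count += 1
    return count

n = 1024
print(loop_iterations(n))        # log2(n) = 10
print(loop_iterations(n * n))    # log2(n^2) = 2*log2(n) = 20
```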

@Kaushal covered why you can multiply the above two results to get O(n^2 log(n)). I would add that if you want to convince yourself of that, you can use induction for these kinds of problems, though that can get quite long.

PiRocks