Is the big-O of the following recursive code simply O(n choose k)?
int nchoosek(int n, int k) {
    if (n == k) return 1;
    if (k == 0) return 1;
    return nchoosek(n-1, k) + nchoosek(n-1, k-1);
}
I'm not sure whether the notation is correct, but I think you can write the recurrence for this function as

T(n, k) = T(n-1, k) + T(n-1, k-1) + O(1)
Since the recursion has two branches, we just have to analyse the worst case of each and take the slower one.
Worst case of T(n-1, k)

Given that 0 < k < n and k is as far as possible from n, we have

T(n-1, k) = T(n-1) = O(n)
Worst case of T(n-1, k-1)

We need 0 < k < n and k as close to n as possible. Then

T(n-1, k-1) = T(n-1) = O(n)
Therefore T(n, k) = 2*O(n) + O(1) = O(n)
Another way of seeing this is by reducing the problem to other known problems. For example, you can solve the same problem by using the definition of the choose function in terms of the factorial function:

nCk = n! / (k! * (n-k)!)
The running time of the factorial is O(n), even in the recursive case.
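As a rough sketch (the helper name factorial is mine, not from the question, and it ignores integer overflow), a plain recursive factorial makes one call per decrement of n, which is where the O(n) comes from:

// Recursive factorial: one call per value from n down to 1, so O(n) calls,
// each doing O(1) work. Ignores overflow, which int hits quickly.
int factorial(int n) {
    if (n <= 1) return 1;
    return n * factorial(n - 1);
}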
nCk requires calculating the factorial three times:

n!     => O(n)
k!     => O(k)   = O(n)
(n-k)! => O(n-k) = O(n)
The multiplication and the division are both constant time, O(1), hence the running time is:

T(n) = 3*O(n) + 2*O(1) = O(n)
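Here is an illustrative sketch of that reduction (the function names are mine, not from the question, and like the closed-form formula itself it overflows int for all but small inputs):

// Same recursive factorial as above, repeated so this compiles on its own: O(n).
int factorial(int n) {
    if (n <= 1) return 1;
    return n * factorial(n - 1);
}

// n choose k via the closed-form formula: three O(n) factorials
// plus one multiplication and one division, i.e. O(n) overall.
int nchoosek_fact(int n, int k) {
    return factorial(n) / (factorial(k) * factorial(n - k));
}

For example, nchoosek_fact(5, 2) evaluates 120 / (2 * 6) = 10.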