I'm trying to calculate the time complexity of a recursive algorithm and I think I've almost got it. Here's the pseudocode I've been looking at:
long pow(long x, int n) {
    if (n == 0)
        return 1;                                // x^0 = 1
    if (n == 1)
        return x;                                // x^1 = x
    if (isEven(n))
        return pow(x, n / 2) * pow(x, n / 2);    // even exponent
    else
        return x * pow(x * x, n / 2);            // odd exponent
}
isEven merely determines whether the integer passed to it is even, and for the purposes of this example, operates in constant time.
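(In case it matters, something like this is what I have in mind for isEven; the exact implementation isn't important here:)

static boolean isEven(int n) {
    return n % 2 == 0;   // constant-time parity check
}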
So, if n = 0 or n = 1, it runs in constant time, like this: f(n) = C0. However, when n > 1, it should operate like so: f(n) = f(n-1) + f(n-1) + C1 when n is even and f(n) = f(n-1) + 1 when n is odd, correct? Or should it be: f(n) = f(n/2) + f(n/2) + C1 when n is even and f(n) = f(n/2) + 1 when n is odd?
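To sanity-check whichever recurrence is right, I've been thinking of instrumenting the function with a call counter (callCount is just a name I made up for this sketch) and comparing the count for a few values of n:

static long callCount = 0;

static long pow(long x, int n) {
    callCount++;                                 // count every invocation of pow
    if (n == 0)
        return 1;
    if (n == 1)
        return x;
    if (isEven(n))
        return pow(x, n / 2) * pow(x, n / 2);
    else
        return x * pow(x * x, n / 2);
}

// e.g. reset callCount, call pow(2, 16), and print callCount to see how many calls were made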
I've been looking at a lot of examples. Here is one I've found very helpful. My problem stems from there being two recursive calls when n is even. I'm not fully sure what to do here. If anyone could point me in the right direction I'd really appreciate it.