I wrote a program that computes the largest power of 2 less than or equal to a given input number. For instance, the largest power of 2 in the number 26 is 16, since 2^4 = 16. Here is the algorithm:
uint power2( uint n )
{
    uint i = n;
    uint j = i & (i - 1);

    while( j != 0 )
    {
        i = j;
        j = i & (i - 1);
    }

    return i;
}
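For reference, this is roughly how I exercise it; the typedef and main are just my own test scaffolding, and the values in the comments are what the function returns for those inputs:

#include <stdio.h>

typedef unsigned int uint;   /* so the uint in power2 compiles as plain C */

uint power2( uint n );       /* the function shown above */

int main( void )
{
    printf( "%u\n", power2( 26 ) );   /* 16 */
    printf( "%u\n", power2( 16 ) );   /* 16: already a power of 2, the loop body never runs */
    printf( "%u\n", power2( 1000 ) ); /* 512 */
    return 0;
}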
I've struggled a bit with analysis of algorithms. I know we are trying to find the Big-Oh, Big-Omega, or Big-Theta of the running time, and that to analyze an algorithm we are supposed to count the number of basic operations. My problem here is that I see two lines that could be basic operations: the uint j = i & (i - 1) outside the while loop, and the j = i & (i - 1) inside the while loop. I feel like the one inside the while loop is a basic operation for sure, but what about the one outside the loop?
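To make the counting concrete, here is an instrumented copy I sketched; the count out-parameter is my own addition, bumped once for the i & (i - 1) outside the loop and once per iteration inside it:

uint power2_counted( uint n, uint *count )
{
    uint i = n;
    uint j = i & (i - 1);   /* candidate basic operation outside the loop */
    *count = 1;

    while( j != 0 )
    {
        i = j;
        j = i & (i - 1);    /* candidate basic operation inside the loop */
        (*count)++;
    }

    return i;
}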
The other part I struggle with is determining how many times the body of the while loop executes. For a loop like for(int i = 0; i < n; i++) {...} we know the body runs n times, and for while(i < n * n) {... i++} (with i starting at 0) it runs n * n times. But for this while loop, the count varies with the input. If the number you pass in is a power of 2 right off the bat, the loop body never executes; but if you pass in a really large number, it executes many times. I honestly don't know how many times it runs, and that's what I am trying to figure out. Can anyone help me understand what is going on in this algorithm and how many times the loop runs, like O(n) or O(n^2), etc.?
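To make the question concrete, here is the small driver I used on top of the instrumented sketch above (the input values are arbitrary); it just prints the operation count so I can see how it varies with the input:

#include <stdio.h>

typedef unsigned int uint;

uint power2_counted( uint n, uint *count );   /* instrumented sketch from above */

int main( void )
{
    uint inputs[] = { 16, 26, 255, 1024, 1000000 };

    for( int k = 0; k < 5; k++ )
    {
        uint count = 0;
        uint result = power2_counted( inputs[k], &count );
        printf( "power2(%u) = %u, %u basic operations\n", inputs[k], result, count );
    }

    return 0;
}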