I'm looking for a way to find how many times a constant x can be divided by two without a remainder, without using loops, recursion, or the logarithm. This is the same problem as counting the trailing zero bits of x, i.e. finding the index of its least significant set bit, so I'm hoping there is some way to do it with bitwise operations. Unfortunately I haven't been able to come up with one. Any ideas?
Background: I have a loop that doubles a counter on every iteration until it reaches the least significant set bit of the constant x. I want this loop to be unrolled, but the NVIDIA CUDA compiler isn't smart enough to figure out the number of iterations on its own, so I want to rewrite the loop in such a way that the trip count is obvious to the compiler:
for(i=1; CONST & i == 0; i *= 2)
bla(i);
For example, with CONST = 24 (binary 11000) the body runs for i = 1, 2, and 4, so the unrolled loop needs three iterations. It should become something like
#define ITERATIONS missing_expr_with_CONST
for(i=0; i < ITERATIONS; i++)
fasel(i);
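To make the shape of the answer concrete, here is an untested sketch of the kind of constant expression I have in mind. It isolates the lowest set bit (a power of two) and then reads off that bit's index with five masks, one per bit of the index. The LSB helper macro is my own invention, and the sketch assumes CONST is a non-zero 32-bit constant:

#define LSB(x) ((x) & (0u - (x)))  /* isolate the least significant set bit */
#define ITERATIONS (                                                       \
      (!!(LSB(CONST) & 0xAAAAAAAAu) << 0)  /* bit 0 of the bit index */    \
    | (!!(LSB(CONST) & 0xCCCCCCCCu) << 1)  /* bit 1 */                     \
    | (!!(LSB(CONST) & 0xF0F0F0F0u) << 2)  /* bit 2 */                     \
    | (!!(LSB(CONST) & 0xFF00FF00u) << 3)  /* bit 3 */                     \
    | (!!(LSB(CONST) & 0xFFFF0000u) << 4)) /* bit 4 */

Each mask covers exactly those bit positions whose index has a particular bit set, so ORing the five tests together reconstructs the index (e.g. CONST = 24 gives LSB = 8 = 2^3, which matches the first two masks and yields 3). If I haven't slipped up, this expands to a plain integer constant expression with no loops, recursion, or logarithms, so the compiler should be able to fold it and unroll the loop; note that it incorrectly yields 0 for CONST == 0.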