I came across a StackOverflow answer that gives the following code to efficiently count the number of bits that are set to 1 in a 32-bit int:
int NumberOfSetBits(int i)
{
    i = i - ((i >> 1) & 0x55555555);
    i = (i & 0x33333333) + ((i >> 2) & 0x33333333);
    return (((i + (i >> 4)) & 0x0F0F0F0F) * 0x01010101) >> 24;
}
But I had a lot of trouble understanding it, and I couldn't find a link where it's explained properly. Can anyone help me understand this piece of code, or point me to a link that explains it well?
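For comparison, here is the straightforward loop-based version I do understand, which just checks the bits one at a time (a minimal sketch; the function name NumberOfSetBitsNaive and the test value are my own):

#include <stdio.h>

/* Naive version: test each bit in turn and count the ones. */
int NumberOfSetBitsNaive(unsigned int i)
{
    int count = 0;
    while (i != 0) {
        count += i & 1;  /* add 1 if the lowest bit is set */
        i >>= 1;         /* shift the next bit into place */
    }
    return count;
}

int main(void)
{
    printf("%d\n", NumberOfSetBitsNaive(0x2Fu)); /* 0x2F = 0b101111, prints 5 */
    return 0;
}

I can see both give the same results, but the masked-add version above manages it without looping over every bit, and that's the part I can't follow.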