
I have written this piece of code as an exercise. It counts the number of bits used to store an integer on one's system.

The code works fine and gives the correct output when I define `unsigned int x`, but when I define just `int x` it does not produce any result at all. The program compiles without any errors or warnings, but nothing is displayed on the terminal when x is declared as plain int.

int int_size ()
{
    int x = ~0;             /* start with all bits set */
    char bit_count = 0;
    while ( x != 0 )        /* shift right until no bits remain */
    {
        x >>= 1;
        bit_count++;
    }
    return bit_count;
}

It works fine with the `unsigned int x` declaration. I am just curious to know why it behaves weirdly when using the signed int declaration.
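
For reference, this is the variant that works for me; the only change to the function is the type of x (the `main` driver and the `#include` are added here just to make the sketch self-contained and runnable):

#include <stdio.h>

/* Same routine, but with x declared as unsigned int.
   This version terminates and prints the bit count. */
int int_size ()
{
    unsigned int x = ~0u;   /* all bits set */
    char bit_count = 0;
    while ( x != 0 )
    {
        x >>= 1;            /* right shift of an unsigned value fills with zeros */
        bit_count++;
    }
    return bit_count;
}

int main(void)
{
    printf("%d\n", int_size());
    return 0;
}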

FYI: I'm using the Code::Blocks IDE with the gcc compiler.

  • Have you tried using a debugger? Have you tried adding traces? Just print `x` in the loop to understand what happens! – Ilya Nov 16 '16 at 05:51
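
Following that suggestion, a minimal sketch of such a trace is shown below. The 64-iteration safety cap and the `main` driver are added here only so the demonstration always terminates and compiles on its own; they are not part of the original code:

#include <stdio.h>

int int_size ()
{
    int x = ~0;
    char bit_count = 0;
    /* The bit_count < 64 condition is only a safety cap for the
       demonstration, so the trace stops even if x never reaches 0. */
    while ( x != 0 && bit_count < 64 )
    {
        printf("iteration %d: x = %d\n", bit_count, x);
        x >>= 1;
        bit_count++;
    }
    return bit_count;
}

int main(void)
{
    printf("bit_count = %d\n", int_size());
    return 0;
}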

0 Answers