
I'm using the code in This Link to generate a uint whose bits are all set to 1 in a desired range of indices.

e.g.:

#include <iostream>
#include <string>

int main ()
{
    // range is [32,5], so I want [1 1 1 ... 1 0 0 0 0 0]
    uint mask = ((1 << 32) - 1) ^ ((1 << (5 - 1)) - 1);
    std::cout << mask << std::endl;
}

The output is 4294967280, which is the desired pattern of bits according to this site.
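
As a side note, the bits can also be checked directly in code with std::bitset instead of a converter site (the <bitset> header is standard; this snippet is just my addition for checking):

#include <bitset>
#include <iostream>

int main ()
{
    // 4294967280 is the output from above; print its 32 bits, MSB first
    std::cout << std::bitset<32>(4294967280u) << std::endl;
    // prints 11111111111111111111111111110000: bits 5 through 32 set
}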

But as soon as I do the following:

#include <iostream>
#include <string>

int main () 
{
    uint stop = 32;
    uint start = 5;
    uint mask = ((1 << stop) - 1) ^ ((1 << (start - 1)) - 1);
    std::cout << mask << std::endl; 
}

It outputs 15. Uuuuh, what's going on? Are the 32 and 5 in the first example a different type that gets cast when assigned to the uint mask? How do I fix this?
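
For reference, here's a minimal sketch of the workaround I'm considering, assuming the culprit is shifting a 32-bit int by 32 (the std::uint64_t detour is my own idea, not from the linked code):

#include <cstdint>
#include <iostream>

int main ()
{
    uint stop = 32;
    uint start = 5;
    // do the shifts on a 64-bit value so a shift count of 32 stays in range,
    // then truncate the result back down to 32 bits
    std::uint64_t high = (std::uint64_t(1) << stop) - 1;
    std::uint64_t low  = (std::uint64_t(1) << (start - 1)) - 1;
    uint mask = static_cast<uint>(high ^ low);
    std::cout << mask << std::endl; // 4294967280 again
}

Is that the right direction, or is there a cleaner way to build the mask?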

user2255757

0 Answers