I do not see why 3 & 0x1111 equals 1. It seems that, for any unsigned 32-bit integer i, i & 0x1111 should be i, right?
However, when I tried this on Ubuntu 14.04, I got 3 & 0x1111 = 1. Why?
#include <stdio.h>

int main() {
    unsigned int a = 3;
    printf("size of a = %zu\n", sizeof(a));           /* %zu for size_t */
    printf("value of 3 & 0x1111 = %u\n", a & 0x1111); /* %u for unsigned */
    return 0;
}