I have been following a tutorial on bit-level operations. The code that I was working on is as follows:
#include <stdio.h>

int main(void) {
    puts("bit-level calculations:");
    puts("----------------------");

    unsigned int x = 10;
    unsigned int y = 1;
    unsigned int result;

    result = x & y;                  /* bitwise AND */
    printf("x & y = %u\n", result);

    result = x | y;                  /* bitwise OR */
    printf("x | y = %u\n", result);

    result = x ^ y;                  /* bitwise XOR */
    printf("x ^ y = %u\n", result);

    return 0;
}
The output was as follows:
x & y = 0
x | y = 11
x ^ y = 11
However, my problem is with the first answer. What I understood is that 1 & 0 = 0, but 1 & 1 = 1, so I was expecting to receive at least 10 & 1 = 10: the first bit is 1 for x and the first digit is 1 for y, and the second bit of x is 0 and the second bit of y is 0, so that position should be 0. My question is why I got only zero, whereas for the OR and XOR I received two digits as a result.
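In case it helps show how I have been picturing the bits, here is a minimal sketch I put together; the print_bits helper is just something I wrote for illustration, it is not part of the tutorial:

#include <stdio.h>

/* Helper written only for illustration: prints the lowest `width`
   bits of a value, most significant bit first. */
static void print_bits(unsigned int value, int width) {
    for (int i = width - 1; i >= 0; i--) {
        putchar(((value >> i) & 1u) ? '1' : '0');
    }
    putchar('\n');
}

int main(void) {
    unsigned int x = 10;
    unsigned int y = 1;

    print_bits(x, 4);      /* bit pattern of x */
    print_bits(y, 4);      /* bit pattern of y */
    print_bits(x & y, 4);  /* bit pattern of x & y */

    return 0;
}

This prints the four low bits of x, y, and x & y one per line, which is how I have been trying to check my reasoning.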
Thank you very much. I do understand that a few questions have already been posted regarding bit-level operations, however their answers do not clarify my question.