Let’s say 5 & 3, four-bit width:
0101b = 5dec
0011b = 3dec
------------
0001b = 1dec
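Just to double-check that column-by-column result, here is a quick sketch in the same perl one-liner style used below ("%04b" is simply a zero-padded four-bit binary format):
$ perl -E 'say 5 & 3'
1
$ perl -E 'printf "%04b\n", 5 & 3'
0001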
You just & the bits in the same column. And since the & operator only returns 1 when both arguments are 1, the higher bits of 5 that are not set in 3 are masked out.
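That masking is exactly why &-ing with a small constant keeps only the low bits; as an illustrative sketch (the input numbers are arbitrary), & 3 keeps just the lowest two bits of anything:
$ perl -E 'printf "%d & 3 = %d\n", $_, $_ & 3 for 4..7'
4 & 3 = 0
5 & 3 = 1
6 & 3 = 2
7 & 3 = 3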
As for your example from the comments:
$ perl -E 'printf "%b\n", 0x76'
1110110
And now:
1110110 = 0x76
0000011 = 3dec
-------
0000010 = 2dec
…and just to validate:
$ perl -E 'say 0x76&3'
2
The schema is simple, you just & each column:
x
y
-
z
Where z is x & y.
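Since each column is a single bit, the whole rule fits in a four-line truth table (a minimal sketch):
$ perl -E 'for my $x (0, 1) { for my $y (0, 1) { say "$x & $y = ", $x & $y } }'
0 & 0 = 0
0 & 1 = 0
1 & 0 = 0
1 & 1 = 1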
Aha, judging by your comments on the neighbouring answer, the problem is elsewhere. Numeric variables do not contain “hexadecimal values”. Numeric variables contain a bit pattern representing a number. “A number” is never binary, decimal or hexadecimal in itself. When you say “three”, there’s no number system in play; three is three no matter what.
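You can see this in perl, too: the base belongs only to the source-code notation, not to the stored number (the 0b binary literal assumes a reasonably recent perl):
$ perl -E 'say 0x76'
118
$ perl -E 'say 3 == 0x3 && 3 == 0b11 ? "same number" : "different"'
same number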
When you say something like var x = 0x76 in the source code, the machine reads the hexadecimal representation of the number, creates a bit pattern representing this number and stores it in the memory corresponding to the variable. And when you then say something like x &= 3, the machine creates a bit pattern representing the number three, combines it with the bit pattern stored in the variable and stores the result back in the variable.
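Putting it together with the earlier example (the variable name is just illustrative):
$ perl -E 'my $x = 0x76; $x &= 3; say $x'
2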