
I'm currently studying bitwise arithmetic. It's mostly easy, since I have some CS background, but there's one thing about this operator that I just don't understand.

For example:

variable3 = variableOne & 3;

or

variable3 &= 3;

Actually, it doesn't matter which form.

I don't understand how the process of setting bits to 0 actually works, and how you can work it out on paper.

Anatoliy Gatt
    And I don't understand why people don't search before asking. I guess we must _both_ have something to learn :-) See http://stackoverflow.com/questions/1746613/bitwise-operation-and-usage/1746642#1746642 – paxdiablo Mar 03 '12 at 09:05

1 Answer


Let’s say 5&3, four-bit width:

0101b = 5dec
0011b = 3dec
------------
0001b = 1dec

You just & the bits in the same column. Since & yields 1 only when both bits are 1, the higher bits that are set in 5 but not in 3 are masked out.
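
Just to double-check on a machine, here's a quick sketch in the same one-liner style as the examples below (%04b prints the value as zero-padded four-bit binary):

$ perl -E 'printf "%04b & %04b = %04b\n", 5, 3, 5 & 3'
0101 & 0011 = 0001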


As for your example from the comments:

$ perl -E 'printf "%b\n", 0x76'
1110110

And now:

1110110b = 0x76
0000011b = 3dec
---------
0000010b = 2dec

…and just to validate:

$ perl -E 'say 0x76&3'
2

The scheme is simple: you just & each column:

x
y
-
z

Where z is x&y.
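
For instance, you can reproduce the whole column layout from above with a single one-liner (again just a sketch; %07b prints seven zero-padded binary digits):

$ perl -E 'my ($x, $y) = (0x76, 3); printf "%07b\n%07b\n-------\n%07b\n", $x, $y, $x & $y'
1110110
0000011
-------
0000010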


Aha, judging by your comments under the neighbouring answer, the problem is elsewhere. Numeric variables do not contain “hexadecimal values”. They contain a bit pattern representing a number, and “a number” is never binary, decimal or hexadecimal in itself. When you say “three”, there's no number system in play: three is three no matter what.

When you write something like var x = 0x76 in the source code, the machine reads the hexadecimal representation of the number, creates a bit pattern representing that number and stores it in the memory corresponding to the variable. When you then write x &= 3, the machine creates a bit pattern representing the number three, combines it with the bit pattern stored in the variable and stores the result back in the variable.
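
You can check that for yourself: the literals 0x76, 118 and 0b1110110 all produce the same bit pattern, so & gives the same result no matter which representation you typed (a quick one-liner check; Perl prints 1 for a true comparison):

$ perl -E 'say 0x76 == 118; say 0x76 == 0b1110110; say((0x76 & 3) == (118 & 3))'
1
1
1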

zoul