
When I type these three statements into a JavaScript console (for example the Node.js REPL):

console.log(5 & 0x5555555555555);
console.log(5 & 0x55555555555555);
console.log(5 & 0x555555555555555);

...then I get the following output:

> console.log(5 & 0x5555555555555);
5
undefined
> console.log(5 & 0x55555555555555);
4
undefined
> console.log(5 & 0x555555555555555);
0
undefined

This seems very spooky: the first hexadecimal value has thirteen 5-digits, the last has fifteen. All three outputs should be 5, but the second and third are wrong.

Of course I want to do bit manipulation with 64-bit values (which can have up to 16 hexadecimal digits). Can anyone tell me what my mistake is here?

Christian
  • The literal `0x55555555555555` doesn't define that number, it defines the number `0x55555555555554` instead, because you've exceeded the range in which JavaScript's `number` type (the standard IEEE-754 double-precision binary floating point) can accurately represent all whole numbers. Similarly, `0x555555555555555` defines the number `0x555555555555540`. – T.J. Crowder Nov 16 '21 at 17:03
  • Okay, thank you (T.J.Crowder) for your explanation! – Christian Nov 16 '21 at 17:05
  • If you use [`BigInt`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/BigInt), you'll get the values you expect: `5n & 0x5555555555555n` => `5n`, `5n & 0x55555555555555n` => `5n`, and `5n & 0x555555555555555n` => `5n`. That's because `BigInt` doesn't have the range limitation that `number` has (instead, it's less efficient), and doesn't convert to 32-bit ints when doing bitwise operations like `number` does. – T.J. Crowder Nov 16 '21 at 17:06
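To make the rounding described in the comments concrete, a quick check in the same console (using the rounded values T.J. Crowder quotes above) looks like this:

// Literals above Number.MAX_SAFE_INTEGER (2^53 - 1) are rounded to the
// nearest representable double, so two different literals can end up as
// the same number value.
console.log(Number.isSafeInteger(0x5555555555555));    // true  (13 digits, still exact)
console.log(Number.isSafeInteger(0x55555555555555));   // false (14 digits, gets rounded)
console.log(0x55555555555555 === 0x55555555555554);    // true  -- same double
console.log(0x555555555555555 === 0x555555555555540);  // true  -- same double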
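
And the `BigInt` approach from the last comment, which keeps 64-bit values exact, would look roughly like this:

// With BigInt (note the trailing `n` on every literal) the values are stored
// exactly and the bitwise AND is not truncated to 32 bits, so all three
// expressions print 5n.
console.log(5n & 0x5555555555555n);    // 5n
console.log(5n & 0x55555555555555n);   // 5n
console.log(5n & 0x555555555555555n);  // 5n

Note that `number` and `BigInt` operands can't be mixed in one bitwise expression (that throws a `TypeError`), so 64-bit values have to stay `BigInt` throughout.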

0 Answers