
00000100000001000001000100

11111100000000000000000000

Each binary number is 26 digits long, and I expected the result to be 100000000000000000000. I'm confused about why the result is 0.

const a = 00000100000001000001000100;
const b = 11111100000000000000000000;
console.log(a & b)

Is it because JS treats 11111100000000000000000000 as a signed number?
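
For comparison, a minimal sketch with explicit `0b` binary literals (new variable names, same digits) does produce the value I expected:

const binA = 0b00000100000001000001000100;
const binB = 0b11111100000000000000000000;
console.log((binA & binB).toString(2)); // "100000000000000000000", i.e. 2 ** 20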

  • `11111100000000000000000000` is not binary; it's a very large decimal number. – CertainPerformance Nov 17 '21 at 00:04
  • @CertainPerformance How can I let JS know it's a binary number? Actually, this number comes from a string. Its length is `26` because I use each bit to represent a letter: `1` means the letter appears at least once in the string. Then I used `&` to check whether the two strings have no letters in common. – Vagrant Coder Nov 17 '21 at 00:09
  • See the linked canonical. – CertainPerformance Nov 17 '21 at 00:10
  • Binary numbers start with `0b`, just like hex numbers start with `0x`. If your number is a string (that is, if it is `"1001"` instead of `1001`), then you can use `parseInt()` to parse it: `parseInt(your_binary_string, 2)` <-- note that you tell `parseInt` to treat the string as a base 2 number ("binary" is another name for base 2 numbers). – slebetman Nov 17 '21 at 02:06
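
Following up on the comments, here is a minimal sketch of the letter-bitmask idea; the helper name `letterMask` and the assumption of lowercase a-z input are mine, not from the question:

// Build a 26-bit mask per string, one bit per letter, then AND the masks.
function letterMask(word) {
  let mask = 0;
  for (const ch of word) {
    mask |= 1 << (ch.charCodeAt(0) - 97); // set one bit per lowercase letter a-z
  }
  return mask;
}

console.log(letterMask("abc") & letterMask("xyz")); // 0 -> no common letters
console.log(letterMask("abc") & letterMask("cat")); // 5 -> shares "a" and "c"

// If the 26-character masks are already strings, parse them as base 2:
const maskA = parseInt("00000100000001000001000100", 2);
const maskB = parseInt("11111100000000000000000000", 2);
console.log((maskA & maskB).toString(2)); // "100000000000000000000"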
