I have two binary numbers, each 26 digits long:

00000100000001000001000100
11111100000000000000000000

I expected the bitwise AND of the two to be 100000000000000000000, but the result is 0, and I don't understand why:
const a = 00000100000001000001000100;
const b = 11111100000000000000000000;
console.log(a & b); // prints 0
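For comparison, here is the computation I expected, written with explicit binary literals (the 0b prefix). This is just a minimal sketch; I renamed the variables to a2 and b2 so they don't clash with the consts above:

const a2 = 0b00000100000001000001000100; // 1052740 in decimal
const b2 = 0b11111100000000000000000000; // 66060288 in decimal
console.log(a2 & b2);               // 1048576
console.log((a2 & b2).toString(2)); // "100000000000000000000" -- what I expected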
Is it because JS treats 11111100000000000000000000 as a signed number?
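To show what I mean by "signed": my understanding is that bitwise operators first convert their operands to 32-bit signed integers, so large values wrap around. For example:

// If I understand correctly, bitwise operators apply ToInt32 to each operand,
// i.e. the value mod 2^32, reinterpreted as a signed 32-bit integer.
console.log((2 ** 31) | 0); // -2147483648 (wraps into the negative range)
console.log((2 ** 32) | 0); // 0 (multiples of 2^32 collapse to 0)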