Can anybody explain:
why double bitwise NOT for Infinity returns 0?

~~Infinity // returns 0

What happens under the hood? What is the binary representation of Infinity in JavaScript?
Because you are not operating on the underlying bit pattern of a number in JavaScript.
You cannot do the equivalent of the following C code in JavaScript:
#include <inttypes.h>
#include <math.h>
#include <stdint.h>
#include <stdio.h>

int main(void) {
    double x = HUGE_VAL;                 /* positive infinity */
    uint64_t y = *((uint64_t *) &x);     /* reinterpret the double's bytes as a 64-bit integer */
    printf("%016" PRIx64 "\n", y);
    printf("%016" PRIx64 "\n", ~y);
    printf("%016" PRIx64 "\n", ~~y);
    return 0;
}
This prints:
7ff0000000000000
800fffffffffffff
7ff0000000000000
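That first value answers the second half of the question: Infinity is the IEEE-754 double whose 11 exponent bits are all ones and whose 52 mantissa bits are all zeros, i.e. 0x7ff0000000000000. You can look at that pattern from JavaScript itself; here is a small sketch using a DataView (buffer and variable names are just illustrative):

// Sketch: inspect the IEEE-754 double bits of Infinity via a DataView.
const buf = new ArrayBuffer(8);
const view = new DataView(buf);
view.setFloat64(0, Infinity);                               // big-endian by default
const hi = view.getUint32(0).toString(16).padStart(8, "0"); // high 32 bits
const lo = view.getUint32(4).toString(16).padStart(8, "0"); // low 32 bits
console.log(hi + lo); // "7ff0000000000000"

But the bitwise operators never see those 64 bits, as MDN explains next.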
As MDN notes:
A bitwise operator treats their[sic] operands as a set of 32 bits (zeros and ones), rather than as decimal, hexadecimal, or octal numbers. ... Bitwise operators perform their operations on such binary representations, but they return standard JavaScript numerical values.
... values with the most significant (left-most) bit set to 1 represent negative numbers (two's-complement representation).
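A quick way to see that 32-bit conversion in action (the expressions below are just illustrative):

// Bitwise operators run their operand through ToInt32 first.
console.log(Infinity | 0);       // 0  -- ToInt32(Infinity) is +0
console.log((2 ** 32 + 5) | 0);  // 5  -- only the low 32 bits survive
console.log((2 ** 31) | 0);      // -2147483648 -- top bit set means negative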
According to 11.4.8 in ES5, we have:
11.4.8 Bitwise NOT Operator ( ~ )
The production UnaryExpression : ~ UnaryExpression is evaluated as follows:

- Let expr be the result of evaluating UnaryExpression.
- Let oldValue be ToInt32(GetValue(expr)).
- Return the result of applying bitwise complement to oldValue. The result is a signed 32-bit integer.
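Roughly, ToInt32 (ES5 section 9.5) maps NaN, ±0 and ±Infinity to +0, reduces everything else modulo 2^32, and reinterprets the top bit as a sign. A sketch of it in plain JavaScript (the function name toInt32 is only for illustration):

// Sketch of ES5 ToInt32: NaN, zeros and the infinities all map to +0.
function toInt32(value) {
  const number = Number(value);
  if (Number.isNaN(number) || number === 0 || !Number.isFinite(number)) {
    return 0;
  }
  const posInt = Math.sign(number) * Math.floor(Math.abs(number));
  const int32bit = ((posInt % 2 ** 32) + 2 ** 32) % 2 ** 32; // modulo 2^32
  return int32bit >= 2 ** 31 ? int32bit - 2 ** 32 : int32bit;
}

console.log(toInt32(Infinity)); // 0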
ToInt32(Infinity) is +0. The first ~ turns that into 0xffffffff, which is -1 as a signed 32-bit integer. The second ~ flips all the bits back to zero.
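You can watch both steps in the console:

console.log(~Infinity);                       // -1, i.e. the 32-bit pattern 0xffffffff
console.log((~Infinity >>> 0).toString(16));  // "ffffffff"
console.log(~~Infinity);                      // 0, all bits flipped back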
That is, it does the equivalent of the following C code:
#include <inttypes.h>
#include <math.h>
#include <stdint.h>
#include <stdio.h>

int main(void) {
    double x = HUGE_VAL;
    /* ToInt32 maps NaN and the infinities to +0; converting an infinite
       double straight to an integer type is undefined in C, so guard it. */
    uint32_t y = isfinite(x) ? (uint32_t) x : 0;
    printf("%08" PRIX32 "\n", y);
    printf("%08" PRIX32 "\n", ~y);
    printf("%08" PRIX32 "\n", ~~y);
    return 0;
}
Output:
00000000
FFFFFFFF
00000000
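The same three lines can be mirrored in JavaScript, using | 0 for the ToInt32 step and >>> 0 to print each 32-bit pattern as unsigned hex (variable names chosen to match the C sketch above):

const x = Infinity;
const y = x | 0;  // ToInt32(Infinity) === 0
console.log((y >>> 0).toString(16).padStart(8, "0"));   // "00000000"
console.log((~y >>> 0).toString(16).padStart(8, "0"));  // "ffffffff"
console.log((~~y >>> 0).toString(16).padStart(8, "0")); // "00000000"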