Why is it in JavaScript that the following results in false:
10 === 000000010 (false)
But this results in true:
010 === 000000010 (true)
In all cases the left and right are both 10, they should all result in true shouldn't they?
JavaScript numbers beginning with a leading 0 followed only by the digits 0–7 are octal (base 8) rather than decimal (base 10).
You can see this in an example like this:
10 === 010 // false
8 === 010 // true
Note that if the literal contains an 8 or 9 digit, it is not a valid octal number and is instead interpreted as decimal:
89 === 089 // true
Note that octal literals don't work in strict mode:
(function(){ "use strict"; return 010 === 10; })()
// SyntaxError: Octal literals are not allowed in strict mode.
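If you actually want octal, ES2015 added an explicit 0o prefix that works everywhere, including strict mode (along with 0b for binary and 0x for hexadecimal):

```javascript
"use strict";
// ES2015 octal literals use the 0o prefix and are allowed in strict mode.
console.log(0o10 === 8);    // true
console.log(0o123 === 83);  // true
// Binary and hex literals have analogous prefixes:
console.log(0b1010 === 10); // true
console.log(0x10 === 16);   // true
```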
This is described in Annex B.1.1 of the ECMAScript specification as non-normative behavior kept for compatibility with older versions of ECMAScript. An octal integer literal is defined as follows:
OctalIntegerLiteral ::
0 OctalDigit
OctalIntegerLiteral OctalDigit
OctalDigit :: one of
0 1 2 3 4 5 6 7
Your current example notwithstanding, numbers prefixed with a 0 that contain only the digits 0–7 are interpreted as octal. A better example would be

123 === 0123 // false

because 0123 in base 10 is 83.
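You can make that conversion visible by doing it explicitly with parseInt (which takes a radix) and Number.prototype.toString (which emits one):

```javascript
// Reading the digit string "123" in different bases:
console.log(parseInt("123", 8));  // 83  - interpreted as octal
console.log(parseInt("123", 10)); // 123 - interpreted as decimal
// And going the other way, 83 written in base 8:
console.log((83).toString(8));    // "123"
```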
To bring it in line with your updated example:
parseInt(10, 10) // 10
parseInt(000000010, 10) // 8
parseInt(010, 10) // 8
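Note that the radix argument changes nothing here: the literals 000000010 and 010 are evaluated to the number 8 before parseInt ever runs, so parseInt(010, 10) is effectively parseInt("8", 10). Passing the digits as a string keeps them intact:

```javascript
// A string preserves its digits; a numeric literal does not:
console.log(parseInt("010", 10)); // 10 - decimal reading of the string
console.log(parseInt("010", 8));  // 8  - octal reading of the string
console.log(parseInt("8", 10));   // 8  - what parseInt(010, 10) really computes
```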