Disclaimer
I am aware that the question Why does 10..toString() work, but 10.toString() does not? already exists, but it does not provide a formal explanation. Its explanation is:
"The specification's interpretation of the . character in that particular position is that it will be a decimal. This is defined by the numeric literal syntax of ECMAScript."

Without a reference to the standard, that explanation is not convincing enough.
The question body
I intuitively understand that

42..toString()

is treated by the parser as the number 42. followed by a .toString() call.
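For reference, these variants all parse fine (checked in a JS console), because the numeric literal is unambiguously terminated before the dot that begins the call:

// All of these parse: the numeric literal ends before the "." that
// starts the property access.
console.log(42..toString());  // "42" -- "42." is the literal, the second "." is the accessor
console.log((42).toString()); // "42" -- parentheses terminate the literal
console.log(42 .toString());  // "42" -- whitespace terminates the literal
console.log(42.0.toString()); // "42" -- "42.0" is the literal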
What I cannot understand is why an interpreter cannot realize that
42.toString()
is the number 42 followed by a method call.
Is this just a limitation of modern JS interpreters, or is it explicitly mandated by ES5.1?
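To make the failure concrete, here is a minimal check; eval is used only so the parse error can be caught instead of aborting the script (the exact message varies by engine):

try {
  // 42.toString() fails at parse time, before any code runs.
  eval("42.toString()");
} catch (e) {
  console.log(e instanceof SyntaxError); // true (message text varies by engine)
}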
In ES5.1, NumericLiteral is defined as follows (only the relevant part of the definition is shown):
NumericLiteral ::
    DecimalLiteral
    HexIntegerLiteral

DecimalLiteral ::
    DecimalIntegerLiteral . DecimalDigits(opt) ExponentPart(opt)
    . DecimalDigits ExponentPart(opt)
    DecimalIntegerLiteral ExponentPart(opt)
The last production is the one I would expect the parser to choose for 42.toString().
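My current hypothesis, and exactly the part I would like to see backed by a spec reference, is that the tokenizer always takes the longest possible match, so the first DecimalLiteral production swallows the dot. Here is a toy sketch of that idea, with the three productions above simplified into one ordered regex (purely illustrative, not how a real engine is implemented):

// Toy longest-match lexer step: the DecimalLiteral alternatives above,
// simplified into a regex anchored at the start of the input.
var decimalLiteral = /^(\d+\.\d*(e[+-]?\d+)?|\.\d+(e[+-]?\d+)?|\d+(e[+-]?\d+)?)/i;

console.log("42.toString()".match(decimalLiteral)[0]);  // "42." -- the dot is consumed by the literal
console.log("42..toString()".match(decimalLiteral)[0]); // "42." -- the second dot is left for the accessor

Under that reading, 42. in 42.toString() is a complete NumericLiteral, and the parser is then left facing the identifier toString with no property accessor in front of it.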
UPD: To clarify, this question expects as an answer references to the ES specification that explicitly state that the interpreter must behave the way it does.