Recently I found out that this is completely valid:
1..toString(); //"1"
To me it looked a little weird at first, but after some further exploration I figured out that it works because the parser treats the first 1. as a complete number literal.
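To illustrate, here is a minimal sketch (my own, assuming any standards-compliant JavaScript engine): these all call toString on the same value, because in each case the numeric literal is terminated before the property access begins.
(1).toString();  // "1" (parentheses end the literal explicitly)
1.0.toString();  // "1" (the dot already belongs to the literal)
1 .toString();   // "1" (whitespace separates the literal from the property access)
1..toString();   // "1" (the first dot closes the literal, the second accesses toString)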
This leads us to the question: if I call .toString on a number:
1.toString(); //SyntaxError
it won't work. However, if I do:
1.0.toString(); /*or*/ 1..toString(); //"1"
it works.
Why is there a difference between 1 and 1.0? I thought there were no distinct number types in JavaScript. Why would the decimal point matter?