
If I run

var num = 23;
var n = num.toString();
console.log(n)

it logs 23 as expected, but if I apply toString() directly to a number literal, like

var n = 15.toString();
console.log(n)

it throws an error:

Uncaught SyntaxError: Invalid or unexpected token.

I noticed it also works fine for decimal values in the num variable (like .3, .99, 2.12, 99.1, etc.). Could someone please help me understand the difference and how this function works?

Steve
Pawan
    This is interesting. I know better than to use the syntax that caused your error, but I couldn't explain why one breaks and the other doesn't! If you were to do console.log('13'.toString()); it works! – Sam W Nov 03 '16 at 19:29
  • Also it works if the number is in parentheses i.e. `(15).toString();` – Jon Nov 03 '16 at 19:31
  • @Sam it's because the parser greedily reads the `.` as part of a *NumericLiteral* token, `23.`, so the period cannot be used as part of a property-access expression. Consider that `23..toString()` does work, because a *NumericLiteral* cannot have two periods, so the excess period is free to be part of the property access expression. – apsillers Nov 03 '16 at 19:53

4 Answers


JavaScript parsing is working as intended. You have written the following:

23.toString()

We see this as an integer with a method being called on it.

The parser doesn't. The parser sees the start of a floating-point literal, which follows this grammar:

[(+|-)][digits][.digits][(E|e)[(+|-)]digits]

It assumes that you are declaring a floating-point literal because the token:

  • is not a variable, as it doesn't start with a letter or another acceptable identifier character
  • is a numeric literal, as it starts with a digit
  • is a floating-point number, as it contains a decimal point (see the sketch just below).
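To make that grammar concrete, here is a small sketch (plain console JavaScript; the variable names are only for illustration) of what the parser accepts and where 23.toString() trips it up:

var a = 23.;     // the trailing dot is part of the literal; a === 23
var b = .5;      // a leading dot is fine too; b === 0.5
var c = 2.5e3;   // digits, decimal part, and exponent; c === 2500

// var d = 23.toString();  // SyntaxError: the parser consumes "23." as the
//                         // literal and then hits "toString" with no operator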

If you really do want to call toString() on the literal 23, isolate the literal like so:

(23).toString(); //interprets 23 as literal

or

23..toString(); //interprets 23. as literal

That being said, JavaScript is flexible enough to know you want to use 23 as a string in most cases. This works fine:

var foo = "The answer is " + 42;  

So does this.

var bar = "39" - 0 + 3; //42

This, however, doesn't do what you might expect:

var baz = "39" + 3; //"393", not 42! (+ concatenates when either operand is a string)
Compass

I believe the JavaScript parser simply doesn't allow you to call methods directly on number literals.

However you can do this...

var n = (15).toString();
console.log(n);

... and it will work.

EDIT

Thanks @apsillers for the explanation. I didn't know that. The first dot after a number is treated as part of the number, hence the problem. 1.1.toString() works. Interesting.

Andre Pena
    They do allow property access directly on a literal, it's just that the first decimal that appears attached to a number literal is always interpreted as part of the literal. If your literal has a decimal point, you can use an additional dot as a property accessor: `1.1.toString()`, `15..toString()`, etc. – apsillers Nov 03 '16 at 19:42

I can't explain why, but if you do

23..toString()

it works.

Another way to cast to a string:

23 + '';

yay JavaScript.

Kolby

When you store 23 into num, that is the value assigned to the variable, so num.toString() is unambiguous. When you call 23.toString(), the parser thinks it is 23(decimal point) followed by some word toString, which doesn't make sense to it.

So what you have to do is add another decimal point afterward to let it know that the number is really 23.0

What you get then is 23.(invisibleZeroHere).toString(), AKA 23..toString()
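A quick sketch of that idea in the console (any standard JavaScript engine should behave the same way):

console.log(23..toString());   // "23": the first dot ends the literal, the second accesses toString
console.log(23.0.toString());  // "23": the same thing written out with the "invisible zero"
console.log((23).toString());  // "23": parentheses isolate the literal instead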

Kevin Johnson