While playing around with JavaScript syntax, it struck me that the following code throws an error in both the SpiderMonkey and V8 engines:
var a = 1, b = 1;
a++++b;
This to me is strange, since the following works perfectly fine:
var a = 1, b = 1;
a+++b; // == 2 (add a and b, then increment a)
// now a == 2 and b == 1
a+++-b; // == 1 (add a and -b, then increment a)
// now a == 3 and b == 1
In addition, the following would be nonsensical code:
var a = 1, b = 1;
a++ ++b; // throws an error
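This is easy to check without actually running any of the snippets: passing the source to the `Function` constructor parses it but never executes it, so a parse failure surfaces as a `SyntaxError`. A minimal sketch (the helper name `parses` is my own):

```javascript
// Returns true if src parses as a function body, false on a SyntaxError.
function parses(src) {
  try {
    new Function(src); // parsed only, never invoked
    return true;
  } catch (e) {
    return false;
  }
}

var ok1 = parses("var a = 1, b = 1; a+++b;");   // parses fine
var ok2 = parses("var a = 1, b = 1; a++++b;");  // rejected
var ok3 = parses("var a = 1, b = 1; a++ ++b;"); // rejected the same way
```

Notably, `a++++b` and `a++ ++b` fail identically, which already hints that the engines see the same token sequence in both.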
My argument is now that if `a+++b` is equivalent to `a++ + b` (and not to `a+ ++b`), and `a+++-b` is equivalent to `a++ + -b`, then `a++++b` can only be interpreted as `a++ + +b` in order to be valid JavaScript code. Instead, the engines insist that `a++++b` is interpreted as `a++ ++b`, by operator precedence.
This to me is in contrast with the logic that the engines implement for the `/`
symbol, as explained here, to distinguish between division and regular expressions. An example:
var e = 30, f = 3, g = 2;
e/f/g; // == 5
e
/f/g; // == 5
/f/g; // is equivalent to new RegExp("f","g")
Here the argument is that because `/f/g` does not make sense as division in the last line, it is interpreted as a regular expression.
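Both readings can be observed side by side. A sketch: when the `/` can continue the previous expression, it is division; at the start of a statement, the same characters form a regex literal.

```javascript
var e = 30, f = 3, g = 2;
var d = e
/f/g;          // continues the expression: 30 / 3 / 2
// d == 5

var re = /f/g; // statement position: parsed as a regex literal
// re instanceof RegExp, with source "f" and the global flag set
```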
Obviously the `/` symbol gets special treatment in order to distinguish between division and regular expressions. But then why do `++` and `--` not get special treatment as well? (That is, outside of operator precedence.)
A second question is why operator precedence is not invoked only when the code has multiple valid interpretations.