When I do something like this:
var x = 5;
console.log( x + (x += 10) ); // (A) logs 20, x == 15
x = 5; // reset, so (B) also starts from 5
console.log( (x += 10) + x ); // (B) logs 30, x == 15
The difference in the logged value between (A) and (B) is explained by the value of x
at the time it is evaluated. I figure that behind the scenes something like this happens:
TIME ---->
(A) (x == 5)        + (x += 10 == 15)  =>   5 + 15 == 20
(B) (x += 10 == 15) + (x == 15)        =>  15 + 15 == 30
But this holds true only if the operands are evaluated
in the same left-to-right order in which they were written.
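For example, here is a sketch of one way to make the evaluation order visible. The trace() helper is just something I made up for illustration, not a built-in: it logs a label before returning its value, so the console output shows which operand was evaluated first.

// Hypothetical helper: logs a label, then returns the value unchanged,
// so the order of the "evaluating" lines reveals the evaluation order.
function trace(label, value) {
    console.log("evaluating", label, "->", value);
    return value;
}

var x = 5;
console.log( trace("left", x) + trace("right", x += 10) );
// evaluating left -> 5
// evaluating right -> 15
// 20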
So, I have a few questions about this:
Is this guaranteed to be true for all JavaScript implementations?
Is it defined to be this way by the standard?
Or is this some kind of undefined behavior in the JavaScript world?
Finally, the same idea could be applied to function call arguments:
var x = 5;
console.log(x += 5, x += 5, x += 5, x += 5); // logs 10 15 20 25
The arguments also appear to be evaluated in order, but is there a firm guarantee that this will always happen?
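For instance, applying the same made-up trace() helper to the argument list, the "evaluating" lines come out in the written order, which is what makes me suspect the arguments are evaluated left to right:

// Same hypothetical trace() helper as above, repeated for completeness.
function trace(label, value) {
    console.log("evaluating", label, "->", value);
    return value;
}

var x = 5;
console.log( trace("1st", x += 5), trace("2nd", x += 5), trace("3rd", x += 5) );
// evaluating 1st -> 10
// evaluating 2nd -> 15
// evaluating 3rd -> 20
// 10 15 20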