Possible Duplicate:
- Unsequenced value computations (a.k.a sequence points)
- Undefined Behavior and Sequence Points
- Operator Precedence vs Order of Evaluation
I'm still trying to wrap my head around how the following expression results in undefined behavior:
a = a++;
Upon searching SO about this, I found the following question:
Difference between sequence points and operator precedence? 0_o
I read through all the answers, but I am still having difficulty with the details. One of the answers describes the behavior of my code example above as ambiguous in terms of how a is modified. For example, it could come down to either of these:
a=(a+1);a++;
a++;a=a;
What exactly makes a's modification ambiguous? Does this have to do with CPU instructions on different platforms, and with how the optimizer can take advantage of the undefined behavior? In other words, is it undefined because of the generated assembly?
I don't see a reason for the compiler to use a=(a+1);a++; — it just looks quirky and doesn't make much sense. What would possess the compiler to make it behave this way?
EDIT:
Just to be clear, I do understand what is happening; I just don't understand how it can be undefined when there are rules on operator precedence (which, I thought, essentially define the order of evaluation of the expression). Assignment happens last in this case, so a++ needs to be evaluated first to determine the value to assign to a. So what I expect is that a is modified first, during the post-fix increment, and then yields a value to assign back to a (the second modification). But the rules for operator precedence seem to make the behavior very clear to me; I fail to see where there is any "wiggle room" for undefined behavior.