The result of evaluating your expression is unspecified, since the evaluations of `x` and `fun(&x, &y)` are indeterminately sequenced.
Notice that an expression evaluation being allowed to yield an unspecified result does not mean your program has undefined behavior; rather, it means you cannot tell which of a finite set of possible results the evaluation of that expression will produce.
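For instance, assuming your expression is something like `x + fun(&x, &y)` and that `fun` modifies `x` through the pointer (a hypothetical definition, since the body of `fun` is not shown in the question), there is only a small set of possible results:

```cpp
#include <iostream>

// Hypothetical stand-in for the question's fun: it modifies x
// through the pointer and returns a value that depends on it.
int fun(int* px, int* py)
{
    *px += *py;
    return *px;
}

int main()
{
    int x = 1;
    int y = 2;

    // The evaluations of x and fun(&x, &y) are indeterminately
    // sequenced: either x is read first (1 + 3 == 4) or fun runs
    // first (3 + 3 == 6). The result is one of {4, 6}: unspecified,
    // but not undefined.
    int r = x + fun(&x, &y);
    std::cout << r << '\n';
}
```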
Interpreting the Standard can be quite tricky here (and in fact, in the original answer I interpreted it the wrong way - thanks to Jerry Coffin for noticing).
To see why this can be tricky, let's consider the first part of paragraph 1.9/15 of the C++11 Standard:
> Except where noted, evaluations of operands of individual operators and of subexpressions of individual expressions are unsequenced. [...] The value computations of the operands of an operator are sequenced before the value computation of the result of the operator. If a side effect on a scalar object is unsequenced relative to either another side effect on the same scalar object or a value computation using the value of the same scalar object, the behavior is undefined.
For example, consider an expression that computes the sum of two sub-expressions, such as (assuming `operator +` is not overloaded here):

```cpp
e1 + e2
```

If the evaluation of `e1` uses the value of a certain scalar `s`, and the evaluation of `e2` has side effects on that value, the behavior is undefined. For instance, the evaluation of:

```cpp
(i + i++)
```

gives undefined behavior because of the sentence quoted above.
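As a complete (deliberately broken) example:

```cpp
int main()
{
    int i = 0;

    // Undefined behavior in C++11: the read of i in the left operand
    // is unsequenced relative to the side effect of i++ on the same
    // scalar object. GCC, for instance, diagnoses this when
    // -Wsequence-point (part of -Wall) is enabled.
    int r = i + i++;

    return r;
}
```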
However, your situation is different. The same paragraph later on specifies:
> When calling a function (whether or not the function is inline), [...] Every evaluation in the calling function (including other function calls) that is not otherwise specifically sequenced before or after the execution of the body of the called function is indeterminately sequenced with respect to the execution of the called function.
This means that if the expression to be evaluated looks like this:

```cpp
i + f(i)
```

and `f(i)` increments `i`, the behavior is no longer undefined, but only unspecified. In other words, your compiler is allowed to evaluate `i` and `f(i)` in either order, but the outcome must be the result of one of those two possible evaluation orders.
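To make this concrete, here is a minimal sketch in which `f` takes its argument by reference, so that it can actually increment the caller's `i`:

```cpp
#include <iostream>

int f(int& n)   // by reference, so f can increment the caller's i
{
    return ++n;
}

int main()
{
    int i = 1;

    // i and f(i) may be evaluated in either order, but f's body is
    // indeterminately sequenced (not unsequenced) relative to the
    // read of i, so the result is unspecified rather than undefined:
    //   read i first:   1 + 2 == 3
    //   call f first:   2 + 2 == 4
    std::cout << i + f(i) << '\n';   // prints 3 or 4
}
```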
If the behavior were undefined, on the other hand, the possible outcome could be anything: crashing, behaving as you expected, printing weird messages on the console, and so on.