#include <stdio.h>
int main(void) {
        int i=1;
        printf("%d %d\n", i+=1, i);
        printf("%d %d\n", i, i+=1);
        return 0;
}

The above code shows different output on Linux and on Mac. On Linux (gcc version 4.9.2 20150212 (Red Hat 4.9.2-6)):

2 2
3 3

On Mac (Apple LLVM version 7.0.0 (clang-700.0.57.2)):

2 2
2 3

On Linux I tried with c11, c99, and c89, but the output is the same, so I don't think it's a C standard issue.

  1. Why such behaviour?
  2. Which one is correct and why?
RatDon
  • 1. Undefined behavior. 2. Both are correct, because anything can happen with undefined behavior. – MikeCAT Nov 23 '15 at 08:09
  • Correct or incorrect according to what standard? – David Schwartz Nov 23 '15 at 08:14
  • @MikeCAT As far as I know, both outputs are by GCC 4.9.2. Just the date is different. Linux's feb-2015 and Apple's sep-2015. So is it possible for gcc to output different in different operating systems? – RatDon Nov 23 '15 at 08:16
  • 1
    @RatDon It can output differently based on anything -- OS, phase of the moon, CPU, optimization levels, randomly, anything it wants. – David Schwartz Nov 23 '15 at 08:17
  • @DavidSchwartz In the Linux output, in both printf calls it first evaluates the values and then prints. Shouldn't it follow some kind of order, like right to left or left to right? Or is that undefined too? – RatDon Nov 23 '15 at 08:22
  • 1
    @RatDon Why should it follow a kind of order? Every rule you impose on the compiler takes away flexibility that it could use to make the program more efficient. Why are you so convinced that the benefits of a forced order outweigh the costs? – David Schwartz Nov 23 '15 at 08:24
  • Thanks. All the examples I saw kind of followed an order, just a different one on different machines, so I thought it should follow some kind of order. But anyhow it doesn't matter, as it's undefined. – RatDon Nov 23 '15 at 08:29
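
If you want to see which order a particular compiler actually picks, you can make the evaluation visible without invoking undefined behavior by routing each argument through a function call. A minimal sketch, using a hypothetical trace() helper that is not part of the original question:

#include <stdio.h>

/* Hypothetical helper: logs when an argument is evaluated and passes the
   value through. The two calls below are indeterminately sequenced rather
   than unsequenced, so the program is well defined; only the order of the
   "evaluated ..." lines is left to the compiler. */
static int trace(const char *label, int value) {
    printf("evaluated %s argument\n", label);
    return value;
}

int main(void) {
    int i = 1;
    printf("%d %d\n", trace("left", i + 1), trace("right", i));
    return 0;
}

The last line is always "2 1"; only the order of the two trace lines can differ between compilers, versions, and optimization levels.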

3 Answers


The order of evaluation of the arguments is not guaranteed by the C standard. That means that every compiler is free to evaluate them in whatever order it sees fit. For that reason (and for the sake of readability) you should avoid writing code in this manner.
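
If the intent matters, one straightforward fix is to perform the update in its own statement so that each printf argument only reads i. A minimal sketch of one such rewrite (the split chosen below is just one of the possible intended meanings):

#include <stdio.h>

int main(void) {
    int i = 1;

    i += 1;                    /* update first, in its own statement */
    printf("%d %d\n", i, i);   /* always prints "2 2" */

    printf("%d ", i);          /* old value: 2 */
    i += 1;                    /* then update */
    printf("%d\n", i);         /* new value: 3, so the line reads "2 3" */
    return 0;
}

Depending on the version, compilers can also warn about the original pattern: Clang has -Wunsequenced (on by default) and GCC has -Wsequence-point (part of -Wall), although whether the latter catches this particular argument-list case varies.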

Shloim

The order in which those arguments are evaluated is not specified by the C standard; it is left to the implementation. Because i is both modified and read here without any sequencing between the two, the code has undefined behavior. That means anything that happens is valid, and also irrelevant, because it is literally undefined.

Magisch

The arguments are evaluated in an unspecified order, so the i+=1 can occur before or after the plain i.

The order is left open because the order in which arguments are passed differs between calling conventions; forcing a single evaluation order would make compilers for some calling conventions slower than others.

This is acceptable, because the alternative (separating the code as below) also leads to better readability (no uncertainty).

#include <stdio.h>

int main(void) {
    int i = 1, tmp1, tmp2;
    /* first line: do the increment, then read i for both arguments */
    i += 1;
    tmp1 = i;                        /* 2 */
    tmp2 = i;                        /* 2 */
    printf("%d %d\n", tmp1, tmp2);   /* prints "2 2" */

    /* second line: read the old value first, then increment */
    tmp1 = i;                        /* 2 */
    i += 1;
    tmp2 = i;                        /* 3 */
    printf("%d %d\n", tmp1, tmp2);   /* prints "2 3" */
    return 0;
}
mksteve