#include <stdio.h>

int main(void) {
    int i = 1;
    printf("%d %d\n", i += 1, i);
    printf("%d %d\n", i, i += 1);
    return 0;
}
The above code produces different output on Linux and on Mac. On Linux (gcc 4.9.2 20150212, Red Hat 4.9.2-6):
2 2
3 3
On Mac (Apple LLVM version 7.0.0, clang-700.0.57.2):
2 2
2 3
On Linux I tried compiling with -std=c11, -std=c99, and -std=c89, but the output is the same in every case, so I don't think the choice of C standard is the issue.
- Why does the behaviour differ between the two compilers?
- Which output is correct, and why?