I've just been testing a program I'm working on, and I see that it executes 3μs faster (a statistically significant change) when I compile it with -g. This makes no sense to me - I thought the -g flag wasn't supposed to affect program execution at all, and that even if it did, it would make the program slower, not faster.
Can anyone tell me why this is happening? And whether it changes the program's execution flow? I am not compiling with -O because I need the program to execute exactly as written, but if -g can somehow make it run faster without changing the instruction order, I should obviously be using it.
So I need to know exactly what changes the -g flag makes to the program.
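In case it's relevant, this is roughly how I've been comparing the two builds to see whether the generated code differs at all (a rough sketch - `prog.c`, `prog_nog`, and `prog_g` are just placeholder names for my actual source and binaries):

```
# Build the same source with and without -g, no optimization in either case
gcc -O0    -o prog_nog prog.c
gcc -O0 -g -o prog_g   prog.c

# Disassemble both and diff the instruction streams;
# -g is only supposed to add debug sections, not change the code itself
objdump -d prog_nog > nog.dis
objdump -d prog_g   > g.dis
diff nog.dis g.dis
```

The diff comes back empty for me, so as far as I can tell the instructions are identical, which makes the timing difference even more confusing.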
Edit: The more tests I run, the larger the t-value gets (i.e., the more statistically significant the difference becomes). This is definitely not measurement error - something is going on.