In the depths of some software I'm working on, there is a line of code...
double DataNoise = StatsStuff.MeanofSquares() - average * average;
Example numbers:
StatsStuff.MeanofSquares() = 1.9739125181231402E-13
average = -4.3328988592605794E-07
DataNoise = 9.6511265664977283E-15 // (State 1)
DataNoise = 9.6511265664977204E-15 // (State 2)
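For reference, the expression can be reproduced stand-alone like this (a simplified sketch with the example numbers hard-coded, not the real analysis code):

using System;

class NoiseRepro
{
    static void Main()
    {
        // Values copied from the logs above; the real code gets them from
        // StatsStuff.MeanofSquares() and the running average.
        double meanOfSquares = 1.9739125181231402E-13;
        double average = -4.3328988592605794E-07;

        double dataNoise = meanOfSquares - average * average;

        // "R" round-trips the exact double; the byte dump shows the raw bit pattern.
        Console.WriteLine(dataNoise.ToString("R"));
        Console.WriteLine(string.Join(", ", BitConverter.GetBytes(dataNoise)));
    }
}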
If I relaunch the analysis from the GUI repeatedly, the result of this calculation eventually changes. Sometimes it changes on the first re-run, but it usually gives a few consistent results before switching to a different answer (the number of runs before the switch varies considerably). Once the software has switched to returning this second value, it never reverts to returning the first.
I'm using C# and Visual Studio, testing on a Windows 7 machine with an i5-4570, in case that helps.
I have seen the problem in both Debug and Release builds.
Each time the analysis is launched, all the analysis objects are recreated within the analysis method, so nothing should persist between runs.
I've logged the values going into the calculation and they do not change; I have also used BitConverter.GetBytes() to confirm that the inputs are bit-for-bit identical between runs.
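The byte check was done roughly along these lines (a sketch; LogBytes is just an illustrative helper, not the actual logging code):

// Dumps the 8 raw bytes of a double's IEEE 754 representation, so values that
// print identically but differ in the last bits would still show up.
static void LogBytes(string name, double value)
{
    Console.WriteLine(name + ": " + string.Join(", ", BitConverter.GetBytes(value)));
}

// Called immediately before the subtraction:
double meanOfSquares = StatsStuff.MeanofSquares();
LogBytes("average", average);
LogBytes("MeanOfSquares", meanOfSquares);
double DataNoise = meanOfSquares - average * average;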
I've already seen the question below, and many other articles like it online, but they all relate to differences between two separate machines:
Why does this floating point calculation give different results...
The answers in how-deterministic-is-floating-point-inaccuracy seem to suggest that I should be able to expect deterministic behaviour from a single machine and instruction set, yet that isn't what I'm seeing.
Any help explaining why this happens and/or how to ensure a consistent result would be greatly appreciated.
Some additional byte values from debugging:
Inputs:
average: 48, 51, 51, 18, 221, 19, 157, 190
MeanOfSquares: 205, 250, 200, 243, 196, 199, 75, 61
Outputs:
DataNoise (state 1): 192, 220, 244, 228, 126, 187, 5, 61
DataNoise (state 2): 187, 220, 244, 228, 126, 187, 5, 61
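Decoding those byte arrays back into doubles (they are in BitConverter.GetBytes order, i.e. little-endian on this CPU) shows the two outputs differ only in the least significant byte, while the inputs are identical between runs. A small stand-alone sketch of that check:

using System;

class DecodeLoggedBytes
{
    static void Main()
    {
        // Byte arrays copied verbatim from the debug output above.
        var state1 = new byte[] { 192, 220, 244, 228, 126, 187, 5, 61 };
        var state2 = new byte[] { 187, 220, 244, 228, 126, 187, 5, 61 };

        double d1 = BitConverter.ToDouble(state1, 0);
        double d2 = BitConverter.ToDouble(state2, 0);

        // "R" prints enough digits to round-trip the exact values.
        Console.WriteLine(d1.ToString("R")); // 9.6511265664977283E-15 per the logs
        Console.WriteLine(d2.ToString("R")); // 9.6511265664977204E-15 per the logs

        // Both values are positive, so the difference of the raw bit patterns
        // is the distance in ULPs; here it comes out to 5.
        long bits1 = BitConverter.DoubleToInt64Bits(d1);
        long bits2 = BitConverter.DoubleToInt64Bits(d2);
        Console.WriteLine(bits1 - bits2);
    }
}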