Comparing some C code with the F# code I'm trying to replace it with, I observed some differences in the final result.
Working back up the code, I discovered that there were differences even at the start, albeit tiny ones.
The code starts by reading in data from a file, and the very first number comes out differently. For instance, in F# (easier to script):
let a = 71.9497985840
printfn "%.20f" a
I get the expected (to me) output 71.94979858400000000000.
But in C:
double a = 71.9497985840;
fprintf(stderr, "%.20f\n", a);
prints out 71.94979858400000700000.
Where does that 7 come from?
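For what it's worth, I assume the value actually stored on the C side can be inspected with C99's %a hexadecimal floating-point format, which sidesteps decimal formatting entirely. A minimal diagnostic sketch (not part of the code I'm porting):
#include <stdio.h>

int main(void)
{
    double a = 71.9497985840;

    /* %a prints the exact binary value held in the double as a
       hexadecimal floating-point literal, with no decimal rounding. */
    printf("%a\n", a);
    printf("%.20f\n", a);
    return 0;
}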
The difference is only tiny, but it bothers me because I don't know why. (It also bothers me because it makes it more difficult to track down where my two versions of the code are diverging.)
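In case it matters how I'm comparing the two programs: one way I can think of to rule out printf/printfn formatting as the culprit is to compare raw bit patterns instead of printed decimals. A rough sketch of the C side (I assume the F# side could use System.BitConverter.DoubleToInt64Bits, though I haven't verified that's the exact counterpart):
#include <inttypes.h>
#include <stdint.h>
#include <stdio.h>
#include <string.h>

int main(void)
{
    double a = 71.9497985840;
    uint64_t bits;

    /* Copy the raw IEEE 754 bit pattern out of the double (assuming a
       64-bit double) so the two programs can be compared bit-for-bit,
       independent of any decimal formatting. */
    memcpy(&bits, &a, sizeof bits);
    printf("0x%016" PRIx64 "\n", bits);
    return 0;
}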