Suppose I have two functions f1 and f2 that return a double and take the same input (a const reference to some kind of object).
These functions are designed in such a way that, given an input x,
f1(x) <= f2(x)
should always hold.
When I test this assertion on a set of 1000 input instances, it fails for a small subset of them. What's remarkable is that in all of these cases f1(x) exceeds f2(x) by a delta of less than 10^-13.
The following code sample is sketchy, but it should be enough for demonstration purposes:
const InputInstance x{...};
const double a{f1(x)};
const double b{f2(x)};
assert(a <= b);
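For what it's worth, the "delta less than 10^-13" figure comes from logging the difference for the failing instances with something along these lines (just a sketch):

if (a > b)
    printf("violation by %g\n", a - b);  // never printed more than ~1e-13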
In some other file, I have the functions f1 and f2 declared as follows:
const double f1(const InputInstance& x);
const double f2(const InputInstance& x);
The following code
printf("FLT_RADIX = %d\n", FLT_RADIX);
printf("DBL_DIG = %d\n", DBL_DIG);
printf("DBL_MANT_DIG = %d\n", DBL_MANT_DIG);
prints:
FLT_RADIX = 2
DBL_DIG = 15
DBL_MANT_DIG = 53
on my system.
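To put those numbers into perspective, one can also look at DBL_EPSILON and at the spacing between adjacent doubles near a given magnitude; the 100.0 below is just a made-up example magnitude, not a value from my code:

#include <cfloat>
#include <cmath>
#include <cstdio>

int main()
{
    // DBL_EPSILON is the gap between 1.0 and the next representable double.
    printf("DBL_EPSILON = %g\n", DBL_EPSILON);

    // The absolute spacing between adjacent doubles grows with the magnitude:
    // near 100.0 it is roughly 100 * DBL_EPSILON, i.e. about 1.4e-14.
    const double magnitude = 100.0;  // made-up magnitude, for illustration only
    printf("spacing near %g = %g\n",
           magnitude, std::nextafter(magnitude, INFINITY) - magnitude);
}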
If I understand this correctly, I can expect the returned doubles to agree to about 15 significant decimal digits. Right?
Should I avoid using the '<=' operator on doubles? Does the 13th decimal have a meaning I'm not aware of, or should I stop complaining and look for a bug in my code ;-) ?
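If raw '<=' really is the wrong tool here, would a check along the following lines be the usual workaround? This is only a sketch: leq_with_tolerance is a name I made up, and the slack of 16 ULPs at the values' magnitude is a guess on my part, not something derived from f1 or f2.

#include <algorithm>
#include <cfloat>
#include <cmath>

// Accept a as "<= b" if a exceeds b by no more than a few ULPs at the
// magnitude of the values involved. The factor 16 is arbitrary slack.
bool leq_with_tolerance(double a, double b)
{
    const double scale = std::max({1.0, std::fabs(a), std::fabs(b)});
    return a <= b + 16.0 * DBL_EPSILON * scale;
}

// ... and then in the test: assert(leq_with_tolerance(a, b));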