
Mathematically, 1 / f * f should equal 1. I tested this identity with the code below, and it does not hold for f = 41.000000 and f = 47.000000.

I think it's related to floating point or rounding, but I don't know the reason. What causes this result?

#include <stdio.h>

int main(void) {
    float f;

    for (f = 1; f < 50; f += 1) {
        /* report values of f for which the identity fails */
        if (1 / f * f != 1)
            printf("f=%f\n", f);
    }
    return 0;
}
Woong

1 Answer


It is indeed caused by floating-point rounding. In real-number arithmetic, 1/f * f == 1 would always be true for non-zero f. In float arithmetic, it is usually, but not always, true.

Taking the case of 41.0, the real-arithmetic value of 1/f is 0.024390243902439024390243902439.... The rounded-to-float result of the division is 0.024390242993831634521484375, which is enough smaller than the exact value that the subsequent multiplication by 41.0, after its own rounding, produces a float just below 1 rather than 1 itself.
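You can see this concretely by printing the intermediate values with enough digits to expose the rounding. Here is a minimal sketch, assuming an IEEE-754 single-precision float (the common case on current implementations); it should show the rounded quotient quoted above and a product slightly below 1:

#include <stdio.h>

int main(void) {
    float f = 41.0f;
    float q = 1.0f / f;   /* division result, rounded to float */
    float p = q * f;      /* multiplication result, rounded to float */

    /* %f promotes the float arguments to double, so the printed digits
       show the exact values stored in q and p. */
    printf("1/f   = %.30f\n", q);
    printf("1/f*f = %.30f\n", p);
    printf("1/f*f == 1 ? %s\n", p == 1.0f ? "yes" : "no");
    return 0;
}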

Floating-point arithmetic is designed to be a close approximation to real arithmetic while still allowing efficient implementation, so it is not surprising that it preserves this real-number identity in many cases, just not quite all of them.

Patricia Shanahan