
Swift floating point arithmetic seems broken compared to C (and therefore Objective-C).

Let's take a simple example. In C:

double output = 90/100.0; // Gives 0.9
float output = 90/100.0f; // Gives 0.9

In Swift:

var output = Double(90)/Double(100.0) // Gives 0.90000000000000002
var output = Float(90)/Float(100.0) // Gives 0.899999976

What's going on? Is this a bug or am I missing something?

EDIT:

#include <iostream>

int main() {
    double inter = 90/100.0;
    std::cout << inter << std::endl; // Outputs 0.9
    return 0;
}
Gadzair

1 Answer


The issue is simply the different number of digits being printed out; the underlying values are the same.

#include <iostream>
#include <iomanip>

int main() {
    double d = 90.0 / 100.0;
    float f = 90.0f / 100.0f;
    std::cout << d  << ' ' << f << '\n';                          // default precision: 6 significant digits
    std::cout << std::setprecision(20) << d << ' ' << f << '\n';  // 20 significant digits
}

0.9 0.9
0.9000000000000000222 0.89999997615814208984

(I wrote this example in C++, but you will get the same results in any language that uses the hardware's floating-point arithmetic and allows this kind of formatting.)
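
The same point can be demonstrated in Swift itself. Here is a minimal sketch (it uses Foundation's String(format:) to control how many digits are printed; the variable names are just for illustration):

import Foundation

let d = Double(90) / Double(100)   // same value as the C/C++ result
let f = Float(90) / Float(100)

// Default %g formatting (6 significant digits) vs. an explicit 20 digits after the point:
print(String(format: "%g %g", d, Double(f)))        // 0.9 0.9
print(String(format: "%.20f %.20f", d, Double(f)))  // 0.90000000000000002220 0.89999997615814208984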

If you want to understand why finite-precision floating-point math does not give you exact results, see:

What Every Computer Scientist Should Know About Floating-Point Arithmetic

And:

Float

bames53