Floating-point arithmetic in Swift seems broken compared to C (and therefore Objective-C).
Let's take a simple example. In C:
double output = 90/100.0; // Gives 0.9
float output = 90/100.0f; // Gives 0.9
In Swift:
var output = Double(90)/Double(100.0) // Gives 0.90000000000000002
var output = Float(90)/Float(100.0) // Gives 0.899999976
What's going on? Is this a bug or am I missing something?
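A quick sanity check I tried in a playground (Double.bitPattern is part of the standard library): the division result compares equal to the literal 0.9, bit for bit, so the stored values don't appear to differ at all:

let d = Double(90) / Double(100.0)
print(d == 0.9)                         // true
print(d.bitPattern == (0.9).bitPattern) // true — identical bits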
EDIT: to confirm the behaviour on the C side, here is a complete C++ program:
#include <iostream>

int main() {
    double inter = 90 / 100.0;
    std::cout << inter << std::endl; // Outputs 0.9
    return 0;
}
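And the closest Swift counterpart I could write (String(format:) comes from Foundation, so I'm assuming that import is available):

import Foundation

let inter = 90 / 100.0
print(String(format: "%g", inter))    // 0.9 — same as the C++ output above
print(String(format: "%.17g", inter)) // 0.90000000000000002 — the digits the playground shows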