I'm trying to display floats to just one decimal place, but I'm getting unexpected results, as follows:
Code:
float a = 1.25;
float b = 1.35;
NSLog(@"1.25 -> %.1f\n1.35 -> %.1f",a,b);
Output:
1.25 -> 1.2
1.35 -> 1.4
Expected output, either:
1.25 -> 1.3
1.35 -> 1.4
or:
1.25 -> 1.2
1.35 -> 1.3
Is this simply due to the internal conversion between binary and decimal? If so, how do I get the expected behaviour?
I'm using Xcode 4.6.
Edit: Okay, thanks to TonyK and H2CO3, it's down to the binary representation of decimal fractions:
float a = 1.25;
float b = 1.35;
NSLog(@"1.25 -> %.30f\n1.35 -> %.30f",a,b);
Output:
1.25 -> 1.250000000000000000000000000000
1.35 -> 1.350000000000000088817841970013
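That accounts for 1.35, but 1.25 is exactly representable in binary, so its 1.2 result presumably comes from the formatter rounding exact halfway cases to the even digit rather than always up. A small check that I'd expect to confirm this (predicted results in the comment, based on that same ties-to-even assumption):
float c = 1.25; // exactly representable, halfway between 1.2 and 1.3
float d = 1.75; // exactly representable, halfway between 1.7 and 1.8
NSLog(@"1.25 -> %.1f\n1.75 -> %.1f", c, d);
// Expected here: 1.25 -> 1.2 and 1.75 -> 1.8, i.e. ties go to the even last digit.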
Lots of good info, but as far as I can see no one has addressed the second question: how do I get the expected behaviour?
Rounding numbers in Objective-C is quite a different question.
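For reference, one workaround that produces the first expected output (1.25 -> 1.3, 1.35 -> 1.4) is to round explicitly before formatting. A minimal sketch using roundf() from math.h, which rounds halfway cases away from zero (not necessarily the best way, and the scale-and-round trick can itself hit representation issues for other values):
#include <math.h>

float a = 1.25;
float b = 1.35;
// Scale to one decimal place, round half away from zero, scale back.
float ra = roundf(a * 10.0f) / 10.0f;
float rb = roundf(b * 10.0f) / 10.0f;
NSLog(@"1.25 -> %.1f\n1.35 -> %.1f", ra, rb);
// Prints: 1.25 -> 1.3 and 1.35 -> 1.4
NSNumberFormatter with an explicit roundingMode, or NSDecimalNumber with an NSDecimalNumberHandler, would be heavier alternatives if true decimal semantics are needed.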