According to this related question's answer, you can try something like this:
// Usage: produces "1 — 1.23"
[NSString stringWithFormat:@"%@ — %@", [self stringWithFloat:1],
    [self stringWithFloat:1.234]];
// Checks if it's an int and if not displays 2 decimals.
+ (NSString *)stringWithFloat:(CGFloat)_float
{
    NSString *format = (NSInteger)_float == _float ? @"%.0f" : @"%.2f";
    return [NSString stringWithFormat:format, _float];
}
In Swift, you can use the "%g" ("use the shortest representation") format specifier:
import UIKit

let firstFloat: Float = 1.0
let secondFloat: Float = 1.2345
let outputString = NSString(format: "first: %0.2g second: %0.2g", firstFloat, secondFloat)
print(outputString)
comes back with:
"first: 1 second: 1.2"