
Not sure how to word the title correctly... but what I am wondering is if there is some clever format specifier that will take the number 4.5 and give me @"4.5" but also take the number 2 and give me @"2".

Using the %.1f specifier gives me @"4.5" but also @"2.0". I am trying to get rid of the ".0" bit.

Does such a beast exist, or am I going to have to do some math on this? FWIW, I am trying to iterate over an array of values ranging from 0 to 5 increasing in half-steps, so 0, 0.5, 1, 1.5, ..., 4.5, 5
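
For reference, here is a rough sketch of what I am doing now (the array and variable names are just placeholders):

NSArray *steps = @[@0, @0.5, @1.0, @1.5, @2.0, @2.5, @3.0, @3.5, @4.0, @4.5, @5.0];
for (NSNumber *step in steps) {
    // %.1f keeps the trailing ".0" on whole numbers, which is what I want to avoid.
    NSString *label = [NSString stringWithFormat:@"%.1f", step.doubleValue];
    NSLog(@"%@", label); // prints 0.0, 0.5, 1.0, ... instead of 0, 0.5, 1, ...
}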

Cheers!

Raconteur

2 Answers


NSNumberFormatter is a good choice here. You can configure it to not show the fractional digits if the number is an integer. For example:

NSArray *numbers = @[@0, @0.5, @1.0, @1.5, @2.0, @2.5];
NSNumberFormatter *numberFormatter = [[NSNumberFormatter alloc] init];
numberFormatter.alwaysShowsDecimalSeparator = NO;
numberFormatter.minimumFractionDigits = 0;
numberFormatter.maximumFractionDigits = 1;
numberFormatter.minimumIntegerDigits = 1;
for (NSNumber *number in numbers) {
    NSLog(@"%@", [numberFormatter stringFromNumber:number]);
}

Output:

>> 0
>> 0.5
>> 1
>> 1.5
>> 2
>> 2.5
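
For the 0-to-5 half-step range in the question, the same configuration can be dropped straight into a loop. A rough sketch using the same formatter settings as above:

NSNumberFormatter *formatter = [[NSNumberFormatter alloc] init];
formatter.alwaysShowsDecimalSeparator = NO;
formatter.minimumFractionDigits = 0;
formatter.maximumFractionDigits = 1;
formatter.minimumIntegerDigits = 1;
// 0.5 is exactly representable in binary, so the increments stay exact.
for (double value = 0.0; value <= 5.0; value += 0.5) {
    NSLog(@"%@", [formatter stringFromNumber:@(value)]);
}
// Logs: 0, 0.5, 1, 1.5, ... 4.5, 5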
Andrew Madsen

This is even easier (Swift):

let num1: Double = 5
let num2: Double = 5.52
let numberFormatter = NumberFormatter()
numberFormatter.numberStyle = .decimal
print(numberFormatter.string(from: NSNumber(value: num1)) ?? "")
print(numberFormatter.string(from: NSNumber(value: num2)) ?? "")

This will print 5 and then 5.52. Note that string(from:) returns an optional, hence the nil-coalescing when printing.

Alan Scarpa