Swift has the same base-10 numeric type that ObjC has had for years: NSDecimalNumber.
One nice change in Swift is that you can, if you choose, add operator overloads to simplify its usage.
import Foundation
// Forward NSDecimalNumber's decimal-exact addition through the + operator.
public func +(lhs: NSDecimalNumber, rhs: NSDecimalNumber) -> NSDecimalNumber {
    return lhs.decimalNumberByAdding(rhs)
}

let g1 = NSDecimalNumber(double: 0.1) + NSDecimalNumber(double: 0.2)
println(g1 == NSDecimalNumber(double: 0.3)) // true
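If you go this route, the same pattern extends to NSDecimalNumber's other arithmetic methods; a minimal sketch for subtraction and multiplication (division and raising-to-power work the same way):

public func -(lhs: NSDecimalNumber, rhs: NSDecimalNumber) -> NSDecimalNumber {
    return lhs.decimalNumberBySubtracting(rhs)
}

public func *(lhs: NSDecimalNumber, rhs: NSDecimalNumber) -> NSDecimalNumber {
    return lhs.decimalNumberByMultiplyingBy(rhs)
}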
If you want even more certainty of avoiding rounding errors (there's a possible rounding error when a value passes through the implicit double conversion in NSDecimalNumber(double:)), you can use the more explicit, integer-based initializer, which cannot suffer rounding:
let point1 = NSDecimalNumber(mantissa: 1, exponent: -1, isNegative: false)
let point2 = NSDecimalNumber(mantissa: 2, exponent: -1, isNegative: false)
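Since both operands are exact decimals (1×10^-1 and 2×10^-1), their sum is exactly 3×10^-1, and the comparison below is exact by construction (this reuses the + overload from above):

let g3 = point1 + point2
println(g3 == NSDecimalNumber(mantissa: 3, exponent: -1, isNegative: false)) // true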
You may be tempted to use NSDecimalNumber(string:), and it's useful, but you have to be very careful about locales. You probably want to pass systemLocale() (which is really a "fallback" locale with fixed settings) to avoid comma-vs-dot confusion:
let g2 = NSDecimalNumber(string: "0.1", locale: NSLocale.systemLocale())
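To see what goes wrong otherwise, here's a sketch of the failure mode; fr_FR (which uses a comma as its decimal separator) is just an example locale:

let french = NSLocale(localeIdentifier: "fr_FR")
// Parsing stops at the "." that fr_FR doesn't recognize as a decimal
// separator, so this typically yields 0 rather than 0.1.
let g4 = NSDecimalNumber(string: "0.1", locale: french)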
But I'd avoid strings here and use one of the safer techniques above.