
According to the docs here, the Swift 3/4 `Decimal` type is a representation in base 10, bridged to `NSDecimalNumber`. However, I'm running into precision issues that do not reproduce when I use `NSDecimalNumber` directly.

import Foundation

let dec24 = Decimal(integerLiteral: 24)
let dec1 = Decimal(integerLiteral: 1)
let decResult = dec1/dec24*dec24
// prints 0.99999999999999999999999999999999999984

let dn24 = NSDecimalNumber(value: 24)
let dn1 = NSDecimalNumber(value: 1)
let dnResult = dn1.dividing(by: dn24).multiplying(by: dn24)
// prints 1

Shouldn't the Decimal struct be accurate, or am I misunderstanding something?

Rafael Nobre
2 Answers


NSDecimalNumber (and its overlay type Decimal) can represent

... any number that can be expressed as mantissa x 10^exponent where mantissa is a decimal integer up to 38 digits long, and exponent is an integer from –128 through 127.

So decimal fractions (with up to 38 decimal digits) can be represented exactly, but not arbitrary numbers. In particular, 1/24 = 0.041666666... has infinitely many decimal digits (it is a repeating decimal) and cannot be represented exactly as a Decimal.

Also, there is no precision difference between Decimal and NSDecimalNumber. That becomes apparent if we print the difference between the actual result and the "theoretical result":

import Foundation

let dec24 = Decimal(integerLiteral: 24)
let dec1 = Decimal(integerLiteral: 1)
let decResult = dec1/dec24*dec24

print(decResult - dec1)
// -0.00000000000000000000000000000000000016


let dn24 = NSDecimalNumber(value: 24)
let dn1 = NSDecimalNumber(value: 1)
let dnResult = dn1.dividing(by: dn24).multiplying(by: dn24)

print(dnResult.subtracting(dn1))
// -0.00000000000000000000000000000000000016
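
The source of that difference becomes visible if you also print the intermediate quotient. A minimal sketch (the variable name `quotient` is mine, and the exact digits you see depend on how the division cuts off the repeating 6s at the 38-digit limit):

import Foundation

let quotient = Decimal(1) / Decimal(24)
print(quotient)
// 0.0416666... : the repeating 6s stop at the 38-digit mantissa limit

print(quotient * 24 - 1)
// -0.00000000000000000000000000000000000016 : the cutoff error scaled by 24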
Martin R
  • I think the question here is about why they get what appears to be `0.9999999...` with `Decimal`, but `1` with `NSDecimalNumber`. Pretty sure this is just a difference in how the numbers print. – Itai Ferber Nov 16 '17 at 15:48
  • @ItaiFerber: OP asked *"Shouldn't the Decimal struct be accurate, ...?"* and that is what I tried to answer. – Martin R Nov 16 '17 at 16:22
  • Yes, that was the question (and misconception). It can't accurately represent a repeating decimal. – Rafael Nobre Nov 16 '17 at 17:04

The problem is simply an artefact of the way Playground formats numbers.

I typed this into a playground:

import Foundation

let dn1 = Decimal(integerLiteral: 1)
let dn24 = Decimal(integerLiteral: 24)
let decResult = dn1 / dn24 * dn24
print(decResult)

let nsdn1 = NSDecimalNumber(value: 1)
let nsdn24 = NSDecimalNumber(value: 24)

let nsdecResult = nsdn1.dividing(by: nsdn24).multiplying(by: nsdn24)
print(nsdecResult)

The playground displays the number on the right-hand side as 0.99999999999999999999999999999999999984 for the first calculation and 1 for the second. However, both print statements printed 0.99999999999999999999999999999999999984.

Here's a screenshot to prove it:

[Playground screenshot: the sidebar shows 0.99999999999999999999999999999999999984 for the Decimal result and 1 for the NSDecimalNumber result, while the console prints 0.99999999999999999999999999999999999984 twice]
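
If you don't have a playground handy, a minimal sketch of the same point (both values convert through the same decimal-to-string logic, so the sidebar's 1 is a display choice, not extra precision):

import Foundation

let decResult = Decimal(1) / Decimal(24) * Decimal(24)
let nsResult = NSDecimalNumber(value: 1)
    .dividing(by: NSDecimalNumber(value: 24))
    .multiplying(by: NSDecimalNumber(value: 24))

// Both produce "0.99999999999999999999999999999999999984".
print(String(describing: decResult) == nsResult.description)
// true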

Oh, and the reason the calculation produces 0.99999999999999999999999999999999999984 instead of 1 is that (as Martin R says) 1/24 cannot be represented exactly as a Decimal.
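
And if in doubt, compare the result against 1 rather than trusting the sidebar; a short sketch using `compare(_:)` and `NSDecimalNumber.one`:

import Foundation

let nsdecResult = NSDecimalNumber(value: 1)
    .dividing(by: NSDecimalNumber(value: 24))
    .multiplying(by: NSDecimalNumber(value: 24))

// The sidebar shows 1, but the value is slightly below one.
print(nsdecResult.compare(NSDecimalNumber.one) == .orderedSame)
// false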

JeremyP