I thought the whole point of the Decimal type was arbitrary-precision arithmetic. (Or rather, I thought Decimal supported arbitrary precision in addition to base-10 arithmetic.)
However, an example I ran into while looking at somebody's question led me to believe that it does have a limit. Consider this code:
import Foundation  // Decimal is defined in Foundation

let string = "728509129536673284379577474947011174006"
if let decimal = Decimal(string: string) {
    print("string \n'\(string) as a Decimal = \n'\(String(describing: decimal))'")
} else {
    print("error converting '\(string)' to a decimal")
}
That outputs
string
'728509129536673284379577474947011174006 as a Decimal =
'728509129536673284379577474947011174000'
It looks like the last digit gets lost. I tried various other values, and they all show a zero in that last position; the low-order digit appears to get truncated to zero once the value contains 39 decimal digits.
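To probe where the truncation starts, I put together this small test; the 38/39-digit cut-off is just my guess based on the output above, and the repeated-9 strings are arbitrary test values:

import Foundation

// Guess from the output above: values with up to 38 significant digits
// seem to survive, and truncation starts at the 39th. This just prints
// the round-trip result for each width so the guess can be checked.
let samples = [
    String(repeating: "9", count: 38),   // 38 digits
    String(repeating: "9", count: 39),   // 39 digits
]

for sample in samples {
    if let value = Decimal(string: sample) {
        print("\(sample.count) digits: \(sample) -> \(value)")
    } else {
        print("\(sample.count) digits: could not parse '\(sample)'")
    }
}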
Is that documented somewhere?