
I'm trying to round down Decimal values using NSDecimalRound(_:_:_:_:). For the most part it behaves according to my expectations, but now and again it returns a value that takes me by surprise. In the example below the first calculation does what I would expect: with a scale of 4, it rounds 2341.2143 down to 2341.2143 (i.e. the same value). The second calculation is identical except that the value being rounded is now 2341.2142 (note the last digit is 2, not 3); this time the returned value is 2341.2141, whereas I would expect 2341.2142. Similar behaviour occurs with 2341.2146, which becomes 2341.2145 when rounded down to 4 d.p. Can anyone explain why this happens?

// Working as expected... ////////////
var unrounded: Decimal = 2341.2143
var rounded = Decimal()

NSDecimalRound(&rounded, &unrounded, 4, .down)
print(rounded) // 2341.2143

// A bit unexpected (to me at least) /////////
var unrounded: Decimal = 2341.2142
var rounded = Decimal()

NSDecimalRound(&rounded, &unrounded, 4, .down)
print(rounded) // 2341.2141
Paul Patterson
  • I've not come across any documentation warning me to avoid doing this. It's not illogical, is it? I mean, it's clearly a bit pointless, but there is a right answer, so it seems reasonable to expect to get it. This code will be used in a function that gets two decimals and rounds the second to match the scale of the first. If I have to, I can add a check to make sure I don't end up in this position; I'd just rather not have to. – Paul Patterson Jun 18 '22 at 21:13
  • @PaulPatterson If you need to preserve the fraction precision you need to use the Decimal string initializer `Decimal(string: "2341.2142")!` or `Decimal(sign: .plus, exponent: -4, significand: 23412142)` – Leo Dabus Jun 18 '22 at 21:53
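
For illustration, a minimal sketch of the two initializers named in the comment above (the printed results are shown as trailing comments):

import Foundation

// Both initializers preserve the decimal digits exactly,
// unlike assignment from a Double literal.
let fromString = Decimal(string: "2341.2142")!
let fromParts = Decimal(sign: .plus, exponent: -4, significand: 23412142)

print(fromString)              // 2341.2142
print(fromParts)               // 2341.2142
print(fromString == fromParts) // true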

1 Answer

The problem comes from the literal, which is a Double and has to be converted to a Decimal. The error is introduced by that conversion:

var unrounded: Decimal = 2341.2143
//   ^-- decimal var    ^-- Double literal

As you know, binary floating-point encoding cannot represent most decimal fractions exactly. You can verify that:

2341.2143 is encoded as 2341.21430000000009385985322296619415283203125  
     rounded down makes 2341.2143
2341.2142 is encoded as 2341.21419999999989158823154866695404052734375 
     rounded down makes 2341.2141 
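
You can reproduce this drift in Swift itself, e.g. by printing the Double with more digits than its default description shows (a minimal sketch; Foundation's String(format:) is used only to display the extra precision):

import Foundation

let d: Double = 2341.2142
// Print 30 fraction digits to expose the nearest representable Double.
print(String(format: "%.30f", d)) // 2341.214199999999891588231548666954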

You need to initialize the Decimal in a way that benefits from its superior precision. Unfortunately, this is not intuitive. You can, for example, use the following, which behaves as expected:

var unrounded3: Decimal = Decimal(sign: .plus, exponent: -4, significand: 23412142)
var rounded3 = Decimal()
NSDecimalRound(&rounded3, &unrounded3, 4, .down)
print(rounded3) // 2341.2142
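
As a side note, the question's comments mention wanting a function that rounds one Decimal to match the scale of another. A minimal sketch building on the safe initializers (the helper name is hypothetical; Decimal's exponent property gives the power of ten of the stored significand):

import Foundation

// Rounds `value` down to the same number of fraction digits as `reference`.
func roundedDown(_ value: Decimal, toScaleOf reference: Decimal) -> Decimal {
    let scale = max(0, -reference.exponent) // fraction digits of `reference`
    var input = value
    var result = Decimal()
    NSDecimalRound(&result, &input, scale, .down)
    return result
}

let reference = Decimal(string: "2341.2142")! // scale 4
print(roundedDown(Decimal(string: "99.123456")!, toScaleOf: reference)) // 99.1234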
Christophe
  • Ok, so actually the same mistake discussed here: https://stackoverflow.com/questions/453691/how-to-use-nsdecimalnumber – matt Jun 18 '22 at 22:06
  • Or here. https://stackoverflow.com/questions/5304855/proper-way-to-instantiate-an-nsdecimalnumber-from-float-or-double – matt Jun 18 '22 at 22:09
  • @matt thanks for these complementary references. It seems to be the same problem. But the code snippets there are Objective-C, which is less telling to people who started directly with Swift. Moreover, one could have expected that in the 7 to 11 years since those answers, now that Decimal has become a first-class Swift type, there would be some better support (it would not be difficult for the compiler to support a suffix on the literal for parsing it directly as a Decimal). – Christophe Jun 18 '22 at 22:36
  • On the contrary, it remains full of traps and bugs. – matt Jun 18 '22 at 22:53