3

Why am I getting 0 when subtracting 5.0 from 650.50 using the subtracting() method?

In the following code, adding, multiplying, and dividing work fine, but subtracting doesn't. Why? What am I doing wrong?

See code in IBM's sandbox: http://swift.sandbox.bluemix.net/#/repl/59b1387696a0602d6cb19201

import Foundation

let num1: NSDecimalNumber = 650.50
let num2: NSDecimalNumber = 5.0

let result = num1.adding(num2)
let result2 = num1.subtracting(num2)
let result3 = num1.multiplying(by: num2)
let result4 = num1.dividing(by: num2)

print("Addition: \(result)") // Addition: 655.5
// Why am I getting 0 here and not 645.5?
print("Subtraction: \(result2)") //Subtraction: 0
print("Multiplication: \(result3)") //Multiplication: 3252.5
print("Division: \(result4)") //Division: 130.1

Apple's docs: https://developer.apple.com/documentation/foundation/nsdecimalnumber

fs_tigre
  • 10,650
  • 13
  • 73
  • 146

2 Answers

2

This may be due to a quirk of NSDecimalNumber in the IBM sandbox (indeed, many parts of Foundation are still not fully available on Linux).

Anyway, whatever the bug is, a solution is to use Decimal, the Swift counterpart of NSDecimalNumber.

Even though Decimal is supposed to be only a wrapper around NSDecimalNumber, it gives the correct result, even on the IBM platform.

Note that this wrapper doesn't use the NSDecimalNumber methods; it uses Swift operators such as + or *.

import Foundation

let num1: Decimal = 650.50
let num2: Decimal = 5.0

let result = num1 + num2
let result2 = num1 - num2
let result3 = num1 * num2
let result4 = num1 / num2

print("Addition: \(result)")
print("Subtraction: \(result2)")
print("Multiplication: \(result3)")
print("Division: \(result4)")

Gives:

Addition: 655.5
Subtraction: 645.5
Multiplication: 3252.5
Division: 130.1

Eric Aya
  • 69,473
  • 35
  • 181
  • 253
  • Quick question, is `Decimal` considered the same as `NSDecimalNumber`? In other words, is `Decimal` recommended currency? Thanks a lot! – fs_tigre Sep 07 '17 at 13:04
  • 1
    `Decimal` is supposed to be the same type as `NSDecimalNumber` but converted to Swift. It is truly intended for money calculations (decadic, high precision arithmetic). However, there are some methods missing compared to `NSDecimalNumber`. – Sulthan Sep 07 '17 at 13:05
  • 1
    @fs_tigre The documentation states that the two are bridged, so I'd say that yes, Decimal is supposed to have the same behavior. – Eric Aya Sep 07 '17 at 13:06
  • Cool, `Decimal` may be my best choice since you can use the native Swift operators (+, - etc.). I will look into what methods I will be missing by using `Decimal`. Thanks a lot. – fs_tigre Sep 07 '17 at 13:14
  • 1
    I believe that Decimal actually corresponds to NSDecimal. NSDecimalNumber is a wrapper around NSDecimal. In either case, it's the preferred way of dealing with currency. See this answer for more info. https://stackoverflow.com/a/41264395/3203487 – David Berry Sep 07 '17 at 14:41
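To illustrate the comments above, here is a minimal sketch (my own example, not taken from the answers) of one NSDecimalNumber convenience that Decimal lacks as an instance method, namely rounding with an NSDecimalNumberHandler, together with the NSDecimalRound function you can use with Decimal instead. It assumes a platform where these Foundation APIs are implemented (e.g. Xcode/macOS rather than the Linux sandbox):

import Foundation

// NSDecimalNumber: rounding is configured with an NSDecimalNumberHandler behavior.
let price = NSDecimalNumber(string: "650.505")
let twoPlaces = NSDecimalNumberHandler(roundingMode: .plain,
                                       scale: 2,
                                       raiseOnExactness: false,
                                       raiseOnOverflow: false,
                                       raiseOnUnderflow: false,
                                       raiseOnDivideByZero: false)
print(price.rounding(accordingToBehavior: twoPlaces)) // 650.51

// Decimal has no equivalent instance method; one option is the
// C-style NSDecimalRound function, also from Foundation.
var exact = Decimal(string: "650.505")!
var rounded = Decimal()
NSDecimalRound(&rounded, &exact, 2, .plain)
print(rounded) // 650.51
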
2

Your code isn't wrong, and it works correctly in Xcode/macOS. However, IBM's Swift Sandbox runs on Linux, and the Linux implementation of Foundation still has issues: on the Status page of the repo, NSDecimalNumber is marked as "Unimplemented", so it is likely to misbehave. Use the native Swift types, such as Decimal, instead.
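As a rough sketch of that workaround (my own illustration, using NSNumber's decimalValue property and the NSDecimalNumber(decimal:) initializer, and assuming the bridging behaves the same in your environment): if you already have NSDecimalNumber values, you can convert them to Decimal, do the arithmetic with plain Swift operators, and convert back only where an API requires NSDecimalNumber.

import Foundation

let num1 = NSDecimalNumber(string: "650.50")
let num2 = NSDecimalNumber(string: "5.0")

// Bridge to the Swift-native Decimal type and use plain operators.
let difference: Decimal = num1.decimalValue - num2.decimalValue
print("Subtraction: \(difference)") // Subtraction: 645.5

// Wrap the result back up only if an API specifically needs NSDecimalNumber.
let asNumber = NSDecimalNumber(decimal: difference)
print(asNumber) // 645.5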

Papershine
  • 4,995
  • 2
  • 24
  • 48