I'm having trouble assigning the exact value 571.97 to a Decimal without precision errors. Double and Float appear to keep the value, but both Decimal and NSDecimalNumber come out off, each by a different amount. Could anyone take a look and tell me where I'm going wrong? I reproduced the issue in a Playground with this code:
import Foundation

let decimal: Decimal = 571.97                      // 571.9700000000001024
let double: Double = 571.97                        // 571.97
let float: Float = 571.97                          // 571.97
let nsDecimalNumber: NSDecimalNumber = 571.97      // 571.9700000000003
let decimalizedDouble = Decimal(floatLiteral: double) // 571.9700000000001024
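For comparison, here is a small sketch of what makes me suspect the literal conversion path. My understanding (which may be wrong) is that `571.97` is a float literal, so it becomes a binary Double before `Decimal` ever sees it, whereas parsing the same digits from a string goes straight to `Decimal`:

```swift
import Foundation

// Literal syntax goes through ExpressibleByFloatLiteral, so the
// Double's binary rounding error is baked in before Decimal is built:
let fromLiteral: Decimal = 571.97
print(fromLiteral) // 571.9700000000001024

// Decimal(string:) parses the decimal digits directly, with no
// binary floating-point round-trip in between:
let fromString = Decimal(string: "571.97")!
print(fromString) // 571.97
```

If that's right, the string initializer sidesteps the issue, but I'd still like to understand why the literal forms differ the way they do.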