import Foundation  // Decimal is defined in Foundation

let decimalA: Decimal = 3.24
let decimalB: Double = 3.24
let decimalC: Decimal = 3.0 + 0.2 + 0.04

print(decimalA) // Prints 3.240000000000000512
print(decimalB) // Prints 3.24
print(decimalC) // Prints 3.24
I'm totally confused. Why does this happen? I know why binary floating-point numbers lose precision, but I can't understand why Decimal loses precision while it is storing a decimal number. My only guess is sketched below, and I haven't been able to confirm it.
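My guess (unverified) is that the float literal 3.24 is parsed as a binary Double first and only afterwards converted to Decimal, so the rounding error already exists before Decimal stores anything. The explicit conversion below is just to illustrate what I mean; I would expect it to behave the same way as the literal form, but I'm not sure:

import Foundation

let fromLiteral: Decimal = 3.24  // literal form (I believe this goes through ExpressibleByFloatLiteral)
let fromDouble = Decimal(3.24)   // explicit Double-to-Decimal conversion

print(fromLiteral) // Prints 3.240000000000000512, as above
print(fromDouble)  // I would expect the same long value here, but I haven't confirmed it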
I want to know how I can initialize a Decimal without losing precision; the alternatives I have in mind are sketched after this paragraph. An explanation of why this happens would also be very helpful to me. Sorry for my poor English.
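Here is a sketch of the alternatives I have in mind. I am assuming that Foundation's Decimal(string:) and Decimal(sign:exponent:significand:) initializers avoid the binary intermediate, but I have not confirmed that either of them is the intended approach:

import Foundation

// Assumption: parsing the decimal text directly should keep the exact digits.
let fromString = Decimal(string: "3.24")!

// Assumption: building the value from an integer significand and a decimal
// exponent (324 x 10^-2) should also avoid any binary intermediate.
let fromSignificand = Decimal(sign: .plus, exponent: -2, significand: 324)

print(fromString)      // I would expect 3.24 here
print(fromSignificand) // and 3.24 here as well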