
I'm trying to create a hash of a given object after converting it to a string in Swift, but the values encoded in the string are different.

        print(myObjectValues.v) // 33.48
        let mydata = try JSONEncoder().encode(myObjectValues)
        let string = String(data: mydata, encoding: .utf8)!
        print(string) // 33.47999999999999488

Here myObjectValues contains a Decimal value of 33.48. If I try to encode this mydata to a string, the value returned is 33.47999999999999488. I've tried rounding the decimal value to 2 places, but the final string keeps producing this number. I've also tried saving it as a String and converting back to Decimal, but the value returned in the encoded string is still this 33.479999999ish.

I can't use the string to calculate and compare a hash, because the hash value returned from the server is for 33.48, which will never equal what I get on my end with this long value.
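The mismatch can be reproduced in isolation. This is a minimal sketch (the `Item` wrapper is a stand-in for the real `myObjectValues` type): a `Decimal` initialized from a `Double` literal carries the binary rounding error into the JSON, while one parsed from a `String` does not.

```swift
import Foundation

struct Item: Codable {
    let v: Decimal
}

do {
    // A Decimal built from a Double literal inherits the Double's
    // binary rounding, and that is what gets encoded:
    let fromDouble = try JSONEncoder().encode(Item(v: Decimal(33.48)))
    print(String(data: fromDouble, encoding: .utf8)!) // {"v":33.47999999999999488}

    // A Decimal parsed from a String keeps the exact decimal digits:
    let fromString = try JSONEncoder().encode(Item(v: Decimal(string: "33.48")!))
    print(String(data: fromString, encoding: .utf8)!) // {"v":33.48}
} catch {
    print(error)
}
```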

Joe
  • This isn't a Decimal value. It's a Double value because that's what JSONEncoder works with. See https://stackoverflow.com/questions/588004/is-floating-point-math-broken If you're looking for a JSON encoder that can work directly with Decimal values (and you actually have Decimal values), you'll need something custom. See https://github.com/rnapier/RNJSON for an in-development example of how you'd approach this. – Rob Napier Jun 24 '21 at 21:22
  • thank you and yes, I know it's a double value and I'm converting this double to a decimal value. Even before encoding it I've tried to convert it to a string value, but all return the same. I've not faced any such issues working on Android. Is this some kind of bug with JSONEncoder in Swift? – Joe Jun 24 '21 at 21:29
  • Can we see the code that was supposed to round it to two digits? It won't round itself. (Note that you *cannot* round it in-place. You must round it and convert it to a string in the same step.) – David Schwartz Jun 24 '21 at 21:48
  • Using a decimal value as a hash is a bad idea, in general you shouldn’t rely on equality for decimal values. Using an Int or a String would make more sense. – EmilioPelaez Jun 24 '21 at 22:14
  • In the end we are using a third-party service that accepts Double values only, so converting it to a string is out of the question, and the values have decimal points so Int will not work here as we would lose the decimal places. – Joe Jun 24 '21 at 22:31
  • Python (on a past release) switched the way to convert float to string, in order to prefer the most compact notation (with exactly the same binary representation). You may want to check the release notes of Python to find the algorithm, and just implement it yourself. (at the end, you need a stable representation, so a round trip, e.g. string to float to string may be enough) – Giacomo Catenazzi Jun 25 '21 at 07:42
  • "I'm converting this double to decimal value." I assume you're doing something like `Decimal(value)`. That will just encode the binary-rounded value (which is what you're trying to avoid). `Decimal(33.48)` is 33.47999999999999488 because that's the Double value you've passed to Decimal.init. If you want to create a Decimal that is decimal rounded rather than binary rounded, you need to create it from a string: `Decimal("33.48")!`. If you encode that, it will work the way you expect. – Rob Napier Jun 25 '21 at 14:23

2 Answers


Decimal values created from underlying Double values will always show these issues.

Decimal values created from underlying String values won't.

What you can do instead:

  1. Keep a private String value as backing storage that exists just for safely encoding and decoding this decimal value.
  2. Expose another computed Decimal value that uses this underlying String value.
import Foundation

class Test: Codable {
    // Not exposed : Only for encoding & decoding
    private var decimalString: String = "33.48"
    
    // Work with this in your app
    var decimal: Decimal { 
        get { 
            Decimal(string: decimalString) ?? .zero
        }
        set {
            // Only the backing string is stored; assigning `decimal` here
            // would call this setter again and recurse forever.
            decimalString = "\(newValue)"
        }
    }
    }
}

do {
    let encoded = try JSONEncoder().encode(Test())
    print(String(data: encoded, encoding: .utf8))
    // Optional("{\"decimalString\":\"33.48\"}")
    
    let decoded = try JSONDecoder().decode(Test.self, from: encoded)
    
    print(decoded.decimal) // 33.48
    print(decoded.decimal.nextUp) // 33.49
    print(decoded.decimal.nextDown) // 33.47
} catch {
    print(error)
}
Tarun Tyagi

I'm trying to create a hash of a given object after converting it to a string in Swift, but the values encoded in the string are different.

Don't do this. Just don't. It's not sensible.

I'll explain it by analogy: Imagine if you represented numbers with six decimal digits of precision. You have to use some amount of precision, right?

Now, 1/3 would be represented as 0.333333. But 2/3 would be represented by 0.666667. So now, if you multiply 1/3 by two, you will not get 2/3. And if you add 1/3 to 1/3 to 1/3, you will not get 1.

So the hash of 1 will be different depending on how you got that 1! If you add 1/3 to 1/3 to 1/3, you will get a different hash than if you added 2/3 to 1/3.

This is simply not the right data type to hash. Don't use doubles for this purpose. Rounding will work until it doesn't.

And you are using 33 + 48/100 -- a value that cannot be represented exactly in base two just as 1/3 cannot be represented exactly in base ten.
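The same effect shows up in base two with a one-liner; 0.3 is the standard example, and 33.48 behaves the same way. A quick sketch:

```swift
// 0.1, 0.2, and 0.3 are all binary-rounded, so the sum of the first
// two is not the same Double as the third:
let sum = 0.1 + 0.2
print(sum)        // 0.30000000000000004
print(sum == 0.3) // false

// Equal Doubles hash equally, but these two are *not* equal,
// so their hashes will almost certainly differ:
print(sum.hashValue == (0.3).hashValue)
```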

David Schwartz
  • yes that's a brilliant point and I'm really hesitant to use hashing with such values, as the chance of values being represented in different formats can't be avoided. I'm open to changing the validation process for my objects if you can suggest some other way that handles these types of values. – Joe Jun 24 '21 at 22:04
  • @Joe Well, what type of values are they? Are they integral numbers of hundredths? If the values aren't precise and discrete, then don't use those types of values in applications that require hashing. Data that doesn't have "one and only one" correct representation should not be hashed, period. Is there some way to change the design so there is one right representation? – David Schwartz Jun 24 '21 at 22:08
  • It's kind of a shopping cart object with different types of values, including strings and double/decimal values. Before proceeding to checkout we need to verify whether the object has been tampered with, and commonly tampering is done with prices, so these double/decimal values are important for hashing. We are rounding all the double and decimal values to two places, like xx.xx, but after encoding the values are changed. – Joe Jun 24 '21 at 22:26
  • @Joe Store and represent them as integer numbers of pennies. – David Schwartz Jun 24 '21 at 23:01
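The suggestion in that last comment can be sketched like this. The `pennies(from:)` helper is hypothetical (the name and implementation are mine, not from the thread), and it assumes non-negative prices with at most two fraction digits; the point is that an integer count of hundredths has exactly one representation, so it is safe to store, compare, and hash.

```swift
import Foundation

// Hypothetical helper: parse a price string like "33.48" into an exact
// integer count of pennies. Assumes a non-negative price with at most
// two fraction digits; returns nil for anything else.
func pennies(from price: String) -> Int? {
    let parts = price.split(separator: ".", omittingEmptySubsequences: false)
    guard parts.count <= 2, let whole = Int(parts[0]) else { return nil }
    let fractionText = parts.count == 2 ? String(parts[1]) : "0"
    // Right-pad so "5" means 50 pennies, then parse exactly.
    guard fractionText.count <= 2,
          let fraction = Int(fractionText.padding(toLength: 2, withPad: "0", startingAt: 0))
    else { return nil }
    return whole * 100 + fraction
}

print(pennies(from: "33.48") ?? -1) // 3348
```

Hashing `3348` (or including it in the string that gets hashed) sidesteps the binary-rounding problem entirely, because there is no floating-point value anywhere in the pipeline.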