
Basically the problem is that in this example:

let d1 = NSNumber(value: 1.4);
let d2 = d1.doubleValue;

let f1 = NSNumber(value: Float(1.4));
let f2 = f1.floatValue;

d1 evaluates to 1.4
d2 evaluates to 1.3999999999999999

f1 evaluates to 1.4
f2 evaluates to 1.3999999999999998

Does anyone know why that is?
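
As a quick check (a minimal sketch, assuming Foundation's `String(format:)`): the variables hold the same bits and just stringify differently, and printing with 17 significant digits reveals the value that is actually stored:

import Foundation

let n = NSNumber(value: 1.4)
print(n)                                       // 1.4  (NSNumber's short description)
print(String(format: "%.17g", n.doubleValue))  // 1.3999999999999999 (the stored value)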

I'm trying to parse a JSON file like:

{"name": "something", "version": 1.4}

with the following code:

let json = try (JSONSerialization.jsonObject(with: someData) as? [String: Any])!;
let version: Double = (json["version"] as! NSNumber).doubleValue;

OR

let version: Double = json["version"] as! Double;

OR

let version: Float = json["version"] as! Float;

And I just can't get 1.4...

Rounding the number is not a solution for me, because I want to write this number back to a JSON file that will be parsed by other programs/languages, and it needs to be exactly 1.4 in the file.

Any suggestions?

UPDATE: The problem occurs only with 1.1 and 1.4. There is no problem with 1.2, 1.3, or 1.5.
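
A minimal sketch that illustrates this update: printing each literal with 17 significant digits shows which values survive the round-trip through binary floating point (1.5 is exactly representable, 1.2 and 1.3 happen to round back cleanly, 1.1 and 1.4 don't):

import Foundation

// %.17g shows enough digits to expose the underlying binary value.
for v: Double in [1.1, 1.2, 1.3, 1.4, 1.5] {
    print(String(format: "%.17g", v))
}
// 1.1000000000000001
// 1.2
// 1.3
// 1.3999999999999999
// 1.5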

UPDATE 2: Serialization code:

    let jsonDict: Dictionary<String,Any> = [
        "name" : name,
        "version" : version
    ];

    let data = try? JSONSerialization.data(withJSONObject: jsonDict, options: []);
    let jsonString = String(data: data!, encoding: .utf8);
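
For the write-back side, a sketch under one assumption: if the value is held as a Decimal (which bridges to NSDecimalNumber, an NSNumber subclass), JSONSerialization should write its decimal description rather than a binary Double:

import Foundation

let jsonDict: [String: Any] = [
    "name": "something",
    "version": Decimal(string: "1.4")!  // bridges to NSDecimalNumber
]
let out = try! JSONSerialization.data(withJSONObject: jsonDict)
print(String(data: out, encoding: .utf8)!)  // e.g. {"version":1.4,"name":"something"}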
Dimitar Atanasov
  • Are you trying to use a floating-point object to hold a version number? Do not do that. Version numbers are not mathematical numbers. They are identification strings. Hold and manipulate it as a string. – Eric Postpischil Jan 22 '18 at 23:28
  • Consider converting the string into an `NSDecimalNumber` – James Bucanek Jan 23 '18 at 04:28
  • @EricPostpischil Software versioning usually consists of a major number, minor number, build number, and revision number, in the format xx.xx.x.xxx. In this case I'm combining the major number and minor number to check whether one version is bigger (newer) than the current one. However, managing version numbers is not what I'm asking about. I just want to know how to parse a decimal number from a JSON file with 100% accuracy, no matter whether it's a software version or something else! I come from C#, PHP, JS – I hadn't hit such a case before... – Dimitar Atanasov Jan 23 '18 at 09:21
  • @JamesBucanek It's not a string I'm getting from the JSON, but a number. Please note that here `{"name": "something", "version": 1.4 }` there are no quotes around `1.4`, so it is interpreted as a number... – Dimitar Atanasov Jan 23 '18 at 09:26
  • Even if you get the version identifier as a number, comparing two versions as numbers will fail, since 1.10 is mathematically less than 1.4 but is a later version string. Version identifiers are text. JSON is text. Extract the version identifier from JSON as text, not a number. [Here](https://stackoverflow.com/questions/35815469/swift-sort-array-of-versions-as-strings/35815636#35815636) is a question about comparing strings containing version identifiers in Swift. – Eric Postpischil Jan 23 '18 at 12:36
  • @EricPostpischil unless versions are structured so that 1.9 is followed by 2.0. As I previously said, I don't want a versioning lesson, but a way to parse a decimal (or float, or double) number exactly as it is and then write it to a file exactly as it is! As for JSON, please educate yourself by going to [json.org](https://www.json.org/). It's a key-value structure. Keys are strings, but values can be string, number, object, array, boolean, or null. That's why, in my code example above, if you try `let d1 = product["version"] as! String;` it will fail with `Unexpected Error (signal SIGABRT)`. – Dimitar Atanasov Jan 23 '18 at 15:26
  • If you however try `let d2 = (product["version"] as! NSNumber).stringValue;` you will get `1.4`, and if you try `let d3 = Double((product["version"] as! NSNumber).stringValue)!;` you will get `1.3999999999999999`, and when you try to write it again to the JSON file as a number it will still be `1.3999999999999999` (a sketch building on this `stringValue` idea appears after these comments). Changing the JSON structure to strings only is not a solution, because other apps work with this file as well. Once again, forget about versioning; it is not the main issue here. The strangest thing is that this is an issue only with `1.4`, while `1.3` and `1.5` don't seem to have this problem... – Dimitar Atanasov Jan 23 '18 at 15:27
  • @DimitarAtanasov: Re “unless version is structured until 1.9 and next one is 2.0.” Are they so structured? – Eric Postpischil Jan 23 '18 at 15:34
  • @DimitarAtanasov: it is impossible to parse “1.4” as exactly 1.4 in binary floating-point because 1.4 cannot be represented in binary floating-point. If you insist on using a numeric format for version identification, you must use a decimal floating-point format. (By the way, the JSON information you point at says “A number is a sequence of decimal digits with no superfluous leading zero.” It only defines this sequence of characters, not an interpretation as a mathematical number or otherwise. It says “JSON is agnostic about the semantics of numbers.”) You might look into NSDecimal. – Eric Postpischil Jan 23 '18 at 15:39
  • @DimitarAtanasov: 1.5 can be exactly represented. With 1.3, the rounding errors that occur just happen to work out. – Eric Postpischil Jan 23 '18 at 15:39
  • @EricPostpischil Thanks for the clarification about the numbers, that's more on the subject now :). I know that these results are due to the in-memory representation of double/float numbers. I'm just wondering why this happens in Swift but not in languages like C#, Java, JS, PHP? I'm doing some tests with NSDecimal now and will post the result here. – Dimitar Atanasov Jan 23 '18 at 15:50
  • Sorry, I wasn't clear. I didn't mean to suggest that you're getting a string that you should convert into `NSDecimalNumber`; I meant to imply that it might already be available as an `NSDecimalNumber` (a subclass of `NSNumber`). If so, you can cast it appropriately and use it directly without any loss of precision. Have you tried setting a breakpoint and checking what kind of value object `json["version"]` is? Also, if the value prints as `"1.4"` then, by definition, it can be converted to a string without any loss—so examine its `description`, or break that into components and compare those. – James Bucanek Jan 23 '18 at 16:42
  • Almost certainly related to https://stackoverflow.com/questions/588004/is-floating-point-math-broken. – Mark Ransom Jan 23 '18 at 16:46
  • @JamesBucanek Yes. Decimal type did the trick, thanks for the help! – Dimitar Atanasov Jan 23 '18 at 16:55
  • @MarkRansom Yes, the question is related. But the issue here is **NOT** `why is this happening`, **BUT** `how to handle it` in Swift. The solution is just to switch from Double/Float to Decimal type. – Dimitar Atanasov Jan 23 '18 at 16:59
  • Understanding the underlying issue is half the battle. I am unable to contribute to the solution because I'm not familiar with Swift. But I will reiterate what @EricPostpischil says - everything contained in a JSON file is a string, and if your program is reading or writing a number then something somewhere is converting it. You need tighter control of that conversion. Evidently the Decimal type does a different conversion. – Mark Ransom Jan 23 '18 at 17:14
  • @MarkRansom :)) JSON is indeed transferred as plain text, just like XML. Do you know the difference between `{"key": 5}`, `{"key": "5"}`, `{"key": {"subkey": 5}}` and `{"key": [2,5]}`? Case 1 - **number**, case 2 - **string**, case 3 - **object**, case 4 - **array**. The whole representation is in text, but there is one simple cross-platform rule for parsing it, and all technologies and languages follow this rule. Of course you can write your own parser and parse this differently, but it won't be the right way... If the value is not in quotes it shouldn't be treated as a String. – Dimitar Atanasov Jan 23 '18 at 17:48
  • There are also rules for writing values out to JSON. If you were seeing a value of 1.3999999999999999 in a debugger and *assuming* it would be written to JSON that way without trying it, then this was a very foolish question. – Mark Ransom Jan 23 '18 at 18:36
  • @MarkRansom Of course I did try it. This was the main problem. The JSON number 1.4 is parsed as 1.3999999999999999 in Double/Float. When I preserve this value and serialize it back to JSON, it's written as 1.3999999999999999. It's all resolved now. Let's cut it. Thanks for the support. – Dimitar Atanasov Jan 23 '18 at 18:59
  • I'd say then that the process of serializing back to JSON is where the problem is, and you didn't show that code so there's not much to say. I'd love to drop this, but the question may stick around for Google to find for years and it would be better to have a full resolution rather than a workaround. – Mark Ransom Jan 23 '18 at 19:16
  • Well I'd say that the problem comes from both - serialization and deserialization. Probably the way Swift handles Float/Double. I'm using Swift's built-in `JSONSerialization` class and the official way to encode/decode JSON. I've pasted the deserialization code above in the question. I didn't want to overcomplicate things by pasting more code, but I'll add it for you. Nothing uncommon about it. Two lines of code in each direction. – Dimitar Atanasov Jan 23 '18 at 19:43
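
A sketch of the `stringValue` route discussed in the comments above (the literal JSON and key names follow the question's example):

import Foundation

let someData = "{\"name\": \"something\", \"version\": 1.4}".data(using: .utf8)!
let json = try! JSONSerialization.jsonObject(with: someData) as! [String: Any]

// The parsed number is Double-backed, but its stringValue reads "1.4" (per the
// comments above), so building a Decimal from that string recovers the exact value.
let number = json["version"] as! NSNumber
let version = Decimal(string: number.stringValue)!
print(version)  // 1.4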

2 Answers


Ok, just to finalise the discussion.

In the end, the Decimal type did the trick. I changed all variable references to Decimal, and NOT NSDecimalNumber, because I got an error that NSDecimalNumber doesn't conform to the Codable and Decodable protocols. Maybe there is a workaround for this, but the easiest solution is just to stick with Decimal.
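
A minimal sketch of what that change might look like (the `Manifest` model is illustrative, not from the original code; note that the second answer below shows decoding a Decimal can still lose precision):

import Foundation

// Hypothetical Codable model; the field names mirror the question's JSON.
struct Manifest: Codable {
    let name: String
    let version: Decimal
}

let manifest = Manifest(name: "something", version: Decimal(string: "1.4")!)
let encoded = try! JSONEncoder().encode(manifest)
print(String(data: encoded, encoding: .utf8)!)  // e.g. {"name":"something","version":1.4}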

I would like to thank @JamesBucanek and @EricPostpischil for joining the discussion and helping resolve this issue!!!

Dimitar Atanasov
  • Thanks! Also note that `Decimal` initialization from a `Double` also exhibits this problem. Convert the `Double` to a `String` first, e.g. `Decimal(string: "1.4")`. [Ref](https://skagedal.github.io/2017/12/30/decimal-decoding.html) – Shawn Jul 30 '18 at 08:58
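
A tiny sketch of the comment's point:

import Foundation

let fromDouble = Decimal(1.4)             // inherits the Double's binary error
let fromString = Decimal(string: "1.4")!  // exact decimal 1.4
print(fromString)                         // 1.4
print(fromDouble)                         // not exactly 1.4 (see the Ref above)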

I had the same story.

I needed to send/receive float values through an API endpoint, so I followed the advice and changed Double to Decimal. It worked fine for encoding data, but not for decoding.

let value = "0.0006"
let decimal = Decimal(string: value)!
print(decimal) // 0.0006 OK
let jsonData = try JSONEncoder().encode(decimal)
print(String(data: jsonData, encoding: .utf8)!) // 0.0006 OK
let decodedDecimal = try JSONDecoder().decode(Decimal.self, from: jsonData)
print(decodedDecimal) // 0.0005999999999999998976 NOOOOOO!!!

However, decoding to Double works fine.

let decodedDouble = try JSONDecoder().decode(Double.self, from: jsonData)
print(decodedDouble) // 0.0006 OK

Also, as was mentioned in the answer above, a Decimal should be initialized with a String to be encoded correctly:

let decimal = Decimal(0.0006)
print(decimal) // 0.0005999999999999998976, and it will be encoded the same way

And of course Double encoding doesn't work as expected; otherwise we wouldn't have this issue.

// 0.00059999999999999995 in all cases, of course
print(String(data: try JSONEncoder().encode(0.0006), encoding: .utf8)!)
print(String(data: try JSONEncoder().encode(decodedDouble), encoding: .utf8)!)
print(String(data: try JSONEncoder().encode(Double(value)!), encoding: .utf8)!)

So my dirty solution for now is to use a wrapper for Double values. It decodes the value as a plain Double, but encodes it through Decimal(string:):

struct CodableDouble: Codable {
    var value: Double
    
    init(_ value: Double) {
        self.value = value
    }
    
    // Decode as a plain Double (lossless, as shown above).
    init(from decoder: Decoder) throws {
        value = try decoder.singleValueContainer().decode(Double.self)
    }
    
    // Encode by way of Decimal(string:) so the JSON keeps the short decimal form.
    func encode(to encoder: Encoder) throws {
        var container = encoder.singleValueContainer()
        let decimal = Decimal(string: "\(value)") ?? 0
        try container.encode(decimal)
    }
}
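
A usage sketch (the `Item` model is hypothetical): with the wrapper in place, a round-trip keeps the short decimal form:

import Foundation

struct Item: Codable {
    let amount: CodableDouble
}

let item = Item(amount: CodableDouble(0.0006))
let out = try! JSONEncoder().encode(item)
print(String(data: out, encoding: .utf8)!)  // {"amount":0.0006}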

But I still don't understand the correct way to handle this issue (other than not using floating-point values).

Kirow