
Or can I check later whether a number was decoded as a decimal number and not an integer?

if let int = any as? Int {
  print("Object is an integer")
} else if let num = any as? Double {
  print("Object is a double")
}

, where "any" is an Any value and = 1.0 (not a string) in the JSON file. "any" can be cast to both integer and double (so the order of which I check determines the outcome), but I would like to keep the original format from the JSON file.

Decoding is done using the following line:

let json = try JSONSerialization.jsonObject(with: data, options: [])
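For illustration, here is a self-contained sketch of the ambiguity (the JSON fragment is just an example):

import Foundation

let exampleData = "[1, 1.0]".data(using: .utf8)!
let parsed = try! JSONSerialization.jsonObject(with: exampleData, options: []) as! [Any]

for any in parsed {
    // Both 1 and 1.0 come back as NSNumber, and both casts succeed for either value,
    // so the branch order decides which message is printed.
    if let int = any as? Int {
        print("Object is an integer: \(int)")
    } else if let num = any as? Double {
        print("Object is a double: \(num)")
    }
}
// Prints "Object is an integer" for both elements.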

Edit: I've tried checking the CFType, but I get the same result for both 1 and 1.0 (inspired by http://stackoverflow.com/a/30223989/1694526).

Any ideas?

swebal
  • JSON makes no distinction between integers and decimals; it's just a number. Swift parses them into an Objective-C NSNumber, which can be converted to both types but internally uses a double. If you want precise parsing, write your own parser or send the numbers as strings. – Sulthan Jan 19 '18 at 10:25
  • Yes, well something similar was said about boolean values and NSNumber a while ago and then someone found a great solution for that, so... – swebal Jan 19 '18 at 10:39
  • You would generally know whether your API provides a particular field as a decimal or an integer, so you should be able to cast it according to your knowledge of the API. If you really aren't sure what to expect, why not just treat it as a double/decimal value? Then you won't lose any information during the cast. – Scriptable Jan 19 '18 at 10:52
  • @swebal A Boolean is a rather different thing. JSON does make a distinction between a boolean and a number. However, `2`, `2.0` and `2.01` are all decimals. – Sulthan Jan 19 '18 at 12:15
  • The app in question will be a JSON editor, so it would be nice to keep the original formatting, but I'll find another solution. – swebal Jan 19 '18 at 15:49

2 Answers


As @Sulthan already mentioned, this is not possible at the level you are working at, because JSONSerialization will (and should) use a single class to represent a numeric value and does not preserve its original type.

You could try to find some other tool to check the values, but does it really make sense?

  1. You are trying to look for differences between Int and Double, but what about 64-bit versus 32-bit? Or signed versus unsigned? We usually don't write those into the JSON either, so there really is no way to distinguish between them and no general logic in doing so.
  2. Are you sure the returned JSON will always have ".0" appended to these values? This really depends on the system, and the smallest optimization would trim it, because the JSON standard does not specify a precision for numbers. For instance, if I use JSONSerialization and print out String(data: (try! JSONSerialization.data(withJSONObject: [ "value": 1.0 ], options: .prettyPrinted)), encoding: .utf8) I receive {\n \"value\" : 1\n}, which means it trimmed the ".0" anyway (see the sketch after this list).
  3. I find it hard to understand how this would be good structurally. If you need to save this data, for instance into a database, you will need to define the size and type of the primitive that holds it. If you need to do some arithmetic, you again need to specify the type...
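A minimal sketch of the round trip mentioned in point 2 (the exact whitespace of the output may differ; the result shown is the one reported above):

import Foundation

// Encode a dictionary whose value is written as 1.0 in Swift.
let encoded = try! JSONSerialization.data(withJSONObject: ["value": 1.0], options: .prettyPrinted)

// The ".0" is not preserved; this prints {"value" : 1} (pretty-printed).
print(String(data: encoded, encoding: .utf8)!)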

The only way would be to treat it as a display string. But in that case your value should be returned as a string, not as a number.
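A minimal sketch of that string-based approach, assuming the API can be changed to deliver the number as a string (the "value" key is just an example):

import Foundation

// The number arrives as a JSON string, so its original formatting survives parsing.
let stringData = "{ \"value\" : \"1.0\" }".data(using: .utf8)!
let parsedObject = try! JSONSerialization.jsonObject(with: stringData, options: []) as! [String: Any]

if let text = parsedObject["value"] as? String, let number = Decimal(string: text) {
    // text keeps the exact characters from the JSON ("1.0"); number can be used for arithmetic.
    print("original: \(text), numeric: \(number)")
}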

Matic Oblak
  • One of the biggest problems with the JSON decoder in Cocoa Touch is the fact that it never uses `NSDecimalNumber` (`Decimal` in Swift) when parsing. JSON does not put any limit on the number of decimal places; however, Cocoa always parses them as a double, with precision loss. That has to be solved by using strings (which requires a change of API) or by using a custom parser. Neither is an ideal solution. – Sulthan Jan 19 '18 at 12:18
  • @Sulthan Are you absolutely sure about this? I just tried this and it seems to work when used properly via `NSDecimalNumber(decimal: (object["value"] as! NSNumber).decimalValue)`. In my test example I used: `let string = "{\"value\" : 0.12345678909876543211234567890}" let data = string.data(using: .utf8)! let object = (try! JSONSerialization.jsonObject(with: data, options: .allowFragments)) as! [String: Any] print("String: \(string)\n\n\nObject: \(object)\n\n\nValue: \(NSDecimalNumber(decimal: (object["value"] as! NSNumber).decimalValue))")` And it worked. – Matic Oblak Jan 19 '18 at 12:38
  • I tested your code on OS X just now and it prints `0.1234567890987654`. – Sulthan Jan 19 '18 at 13:02
  • @Sulthan I honestly just tried this on an iPhone, in the iPhone simulator and in a new OS X app. The result in each case was "Value: 0.1234567890987654321123456789". Is there anything that could make a difference in your case? – Matic Oblak Jan 19 '18 at 13:08
  • I am not on High Sierra yet. Maybe they have finally fixed it. – Sulthan Jan 19 '18 at 13:12
  • It's only important from an editor standpoint, where a JSON file with 1.0 should be presented as a double value and not an integer. So, it's really not a showstopper. – swebal Jan 19 '18 at 15:51
  • Mojave here, it's the same stuff. It's not converted to a double. – JBarros35 May 28 '20 at 17:25

The solution is to parse to an NSNumber and then to a Decimal (or NSDecimalNumber). DO NOT parse via a Double.

import Foundation

let jsonString = "[ 4.01 ]"
let jsonData = jsonString.data(using: .utf8)!
let jsonArray = try! JSONSerialization.jsonObject(with: jsonData, options: []) as! [Any]

// This is the WRONG way to parse decimals (via a Double)
// parseAttemptA = 4.009999999999998976
let parseAttemptA: Decimal = Decimal(jsonArray[0] as! Double)

// This is the CORRECT way to parse decimals (via an NSNumber)
// parseAttemptB = 4.01
let parseAttemptB: Decimal = (jsonArray[0] as! NSNumber).decimalValue

Here's a screenshot of a playground...

Correct Parsing of Decimal in Swift

Oliver Pearmain
  • It's not a problem if the number is 4.01; it's only an issue if the number is 4.0, since parsing it as an integer would succeed as well. – swebal Sep 28 '18 at 15:19