
I want to convert a hexadecimal value into a decimal value, so I tried the following. When the value is > 0 it works fine, but when the value is < 0 it returns the wrong value.

let h2 = "0373"
let d4 = Int(h2, radix: 16)!
print(d4) // 883

let h3 = "FF88"
let d5 = Int(h3, radix: 16)!
print(d5) // 65416

When I pass FF88, it returns 65416, but the expected value is -120.

Right now I am following this Convert between Decimal, Binary and Hexadecimal in Swift answer, but it doesn't always work.

Is there any other solution to get a negative decimal value from a hex string?

Any help would be appreciated!

Meet Doshi

2 Answers


FF88 is the hexadecimal representation of the 16-bit signed integer -120. You can create an unsigned integer first and then convert it to the signed counterpart with the same bit pattern:

let h3 = "FF88"
let u3 = UInt16(h3, radix: 16)!  // 65416
let s3 = Int16(bitPattern: u3)   // -120
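
The same bit-pattern approach works for any fixed width; here is a minimal sketch for a 32-bit value (the input string "FFFFFF88" is just an assumed example):

let h4 = "FFFFFF88"
let u4 = UInt32(h4, radix: 16)!  // 4294967176
let s4 = Int32(bitPattern: u4)   // -120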
Martin R
  • The title says "Convert Hexa to **Decimal** in Swift". How is UInt16 or Int16 considered a Decimal? – iOS Blacksmith Sep 06 '22 at 07:53
  • @iOSBlacksmith: As I understand it from the question, “decimal” here does not refer to the Swift `Decimal` type, but just means an integer (which, when printed, shows the base-10 representation of that number). – Martin R Sep 06 '22 at 07:58

Hexadecimal conversion depends on the integer type (signed or unsigned) and its size (64-bit, 32-bit, 16-bit, ...); this is what you missed.

Source code:

import Foundation

let h3 = "FF88"
// Keep only the low 16 bits and reinterpret them as a signed Int16.
let d5 = Int16(truncatingIfNeeded: strtoul(h3, nil, 16))
print(d5) // -120
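
To see how the chosen width changes the result, here is a minimal sketch that parses the same string into two different sizes (the variable names are illustrative):

import Foundation

let hex = "FF88"
let raw = strtoul(hex, nil, 16)        // 65416, parsed as an unsigned long
print(Int32(truncatingIfNeeded: raw))  // 65416 (fits in 32 bits, stays positive)
print(Int16(truncatingIfNeeded: raw))  // -120 (only 16 bits kept, sign bit set)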
roy