18

I have a problem I couldn't find a solution to. I have a string variable holding the Unicode code point "1f44d" and I want to convert it to a Unicode character.

Usually one would do something like this:

println("\u{1f44d}") // 

Here is what I mean:

let charAsString = "1f44d" // code in variable
println("\u{\(charAsString)}") // not working

I have tried several other ways, but somehow the workings behind this magic stay hidden from me.

One should imagine the value of charAsString coming from an API call or from another object.

yesman82
  • 431
  • 1
  • 3
  • 12

7 Answers

13

One possible solution (explanations "inline"):

let charAsString = "1f44d"

// Convert hex string to numeric value first:
var charCode : UInt32 = 0
let scanner = NSScanner(string: charAsString)
if scanner.scanHexInt(&charCode) {

    // Create string from Unicode code point:
    let str = String(UnicodeScalar(charCode))
    println(str) // 👍
} else {
    println("invalid input")
}

Slightly simpler with Swift 2:

let charAsString = "1f44d"

// Convert hex string to numeric value first:
if let charCode = UInt32(charAsString, radix: 16) {
    // Create string from Unicode code point:
    let str = String(UnicodeScalar(charCode))
    print(str) // 👍
} else {
    print("invalid input")
}

Note also that not all code points are valid Unicode scalars, compare Validate Unicode code point in Swift.


Update for Swift 3:

public init?(_ v: UInt32)

is now a failable initializer of UnicodeScalar and checks if the given numeric input is a valid Unicode scalar value:

let charAsString = "1f44d"

// Convert hex string to numeric value first:
if let charCode = UInt32(charAsString, radix: 16),
    let unicode = UnicodeScalar(charCode) {
    // Create string from Unicode code point:
    let str = String(unicode)
    print(str) // 👍
} else {
    print("invalid input")
}
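
As noted above, not all code points are valid Unicode scalars, and the failable initializer makes that visible. A small illustration (assuming the input "d800", a UTF-16 surrogate and therefore not a valid scalar):

let invalidAsString = "d800" // UTF-16 surrogate, not a valid Unicode scalar

if let charCode = UInt32(invalidAsString, radix: 16),
    let unicode = UnicodeScalar(charCode) {
    print(String(unicode))
} else {
    print("invalid input") // this branch is taken
}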
Martin R
  • 529,903
  • 94
  • 1,240
  • 1,382
  • Reusing another user's answer to improve your own answer just seemed to me like an awful way to do things here… But I might be wrong, maybe it's something usual. – Fantattitude Jul 29 '15 at 09:54
  • 1
    @Fantattitude: Believe it or not, I did not see your answer before I edited this answer using code from **my own** previous answer http://stackoverflow.com/a/27190430/1187415. – Martin R Jul 29 '15 at 09:56
  • I do believe you, don't worry, I just found it a bit harsh. I'm beginning to participate after years of only reading, so I may make mistakes. Still, you could improve your answer by getting rid of the radix (the default value seems to work fine here). – Fantattitude Jul 29 '15 at 09:58
  • 1
    @Fantattitude: No, it does not, your code actually crashes if I try it. – Martin R Jul 29 '15 at 10:00
  • @MartinR Oh well, the playground fooled me; it didn't update after I removed it. – Fantattitude Jul 29 '15 at 10:03
  • May I suggest using Int with radix and printing the UnicodeScalar directly (without using a String)? It's shorter and gives every hint needed to answer the question, I think. (Plus it looks less like black magic to people who don't understand all the Int types.) – Fantattitude Jul 29 '15 at 10:15
  • @Fantattitude: `UInt32` is the underlying value type of `UnicodeScalar`, so using the `init(_ v: UInt32)` constructor seems more natural to me. It is correct that there is also an `init(_ v: Int)` constructor (defined in an extension of UnicodeScalar), but I see no advantage in using it. As an example, `let i = -1 ; let u = UnicodeScalar(i)` compiles, but crashes at runtime. – I also prefer the explicit constructor `String(...)` here instead of string interpolation `"\(uScalar)"` (which does black magic as well :). – Martin R Jul 29 '15 at 10:52
  • I knew you'd say this about the underlying type :P. You can also print(uScalar) without going through String interpolation, or is it doing that behind the scenes when the type isn't supported? It looks more like UnicodeScalar is CustomStringConvertible to me, but I can't check for now. – Fantattitude Jul 29 '15 at 12:01
  • @HemangPandya: I have rejected your edit suggestion because the code is (to the best of my knowledge) still correct. Please let me know if there is a problem with it! – Martin R Dec 09 '16 at 14:58
12

This can be done in two steps:

  1. convert charAsString to an Int code
  2. convert the code to a Unicode character

The second step can be done e.g. like this:

var code = 0x1f44d
var scalar = UnicodeScalar(code)
var string = "\(scalar)"

As for the first step, see here how to convert a String in hex representation to an Int.
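
A sketch of both steps combined, using UInt32(_:radix:) for the first step (with optional binding, since the current UnicodeScalar initializer is failable):

let charAsString = "1f44d"

// Step 1: convert the hex string to a numeric code.
// Step 2: turn that code into a Unicode scalar and print it.
if let code = UInt32(charAsString, radix: 16), let scalar = UnicodeScalar(code) {
    print("\(scalar)") // 👍
}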

Jakub Vano
  • 3,833
  • 15
  • 29
  • Short version of converting a hex string like "1f44d" to an Int: `Int(strtoul(charAsString, nil, 16))` – yesman82 Jul 29 '15 at 10:20
  • This worked well for me; I did convert vars to lets and used a guard on the scalar step to avoid it printing an optional. – TahoeWolverine Aug 16 '21 at 15:23
7

As of Swift 2.0, every Int type has an initializer able to take a String as input. You can then easily generate a corresponding UnicodeScalar and print it afterwards, without having to change your representation of chars as a string ;).

UPDATED: Swift 3.0 changed the UnicodeScalar initializer

print("\u{1f44d}") // 

let charAsString = "1f44d" // code in variable

let charAsInt = Int(charAsString, radix: 16)! // As indicated by @MartinR, radix is required; the default won't do it
let uScalar = UnicodeScalar(charAsInt)! // In Swift 3.0 this initializer is failable, so you'll need either a force unwrap or optional binding

print("\(uScalar)")
Fantattitude
  • 1,842
  • 2
  • 18
  • 34
  • 1
    Using Swift 3.0. Your answer does not print the desired result unless I also add the exclamation point to let uScalar = UnicodeScalar(charAsInt)!, otherwise what I see is Optional("\u{0001F44D}"). – svohara Sep 16 '16 at 03:50
1

You can use

let char = "-12"
print(char.unicodeScalars.map { $0.value })

You'll get the values as:

[45, 49, 50]
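
Note that this goes in the opposite direction of the question (string to scalar values). Going back from such values to a string could look like this sketch:

let values: [UInt32] = [45, 49, 50]
let scalars = values.compactMap { UnicodeScalar($0) } // skip any invalid values
let string = String(String.UnicodeScalarView(scalars))
print(string) // "-12"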
Avinash
  • 4,304
  • 1
  • 23
  • 18
1

Here are a couple ways to do it:

let string = "1f44d"

Solution 1:

"&#x\(string);".applyingTransform(.toXMLHex, reverse: true)

Solution 2:

"U+\(string)".applyingTransform(StringTransform("Hex/Unicode"), reverse: true)
hkdalex
  • 717
  • 7
  • 13
1

I made this extension that works pretty well:

extension String {
    var unicode: String? {
        if let charCode = UInt32(self, radix: 16),
           let unicode = UnicodeScalar(charCode) {
            let str = String(unicode)
            return str
        }
        return nil
    }
}

How to test it:

if let test = "e9c8".unicode {
    print(test)
}

// prints the character for code point U+E9C8 (shown as an image in the original answer)
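
Applied to the code point from the question, it gives the same result as the other answers:

if let thumbsUp = "1f44d".unicode {
    print(thumbsUp) // 👍
}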

Joule87
  • 532
  • 4
  • 13
0

You cannot build a \u{...} escape sequence with string interpolation the way you are trying to. Therefore, the following code won't compile:

let charAsString = "1f44d"
print("\u{\(charAsString)}")

You will have to convert your string variable into an integer (using the init(_:radix:) initializer) and then create a Unicode scalar from this integer. The Swift 5 Playground sample code below shows how to proceed:

let validCodeString = "1f44d"
let validUnicodeScalarValue = Int(validCodeString, radix: 16)!
let validUnicodeScalar = Unicode.Scalar(validUnicodeScalarValue)!
print(validUnicodeScalar) // 👍
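
If the input may be invalid, the force unwraps can be replaced with a guard, for example in a small (hypothetical) helper function:

func character(fromCodeString codeString: String) -> String? {
    guard let value = UInt32(codeString, radix: 16),
          let scalar = Unicode.Scalar(value) else { return nil }
    return String(scalar)
}

print(character(fromCodeString: "1f44d") ?? "invalid input") // 👍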
Imanou Petit
  • 89,880
  • 29
  • 256
  • 218