I have been searching for an answer for the past few days, and most of what I found is very old (around Swift 2 and 1.2). I want to get a character from a Unicode code point stored in a variable, because for some unknown reason this construction doesn't work in Swift:
print("\u{\(variable)}") // should be proposal for including this in Swift 6
People advise using UnicodeScalar instead. However, Apple must have introduced something new in Swift 5. I found a tutorial here, but this code fragment
let value0: UInt8 = 0x61
let value1: UInt16 = 0x5927
let value2: UInt32 = 0x1F34E
let string0 = String(UnicodeScalar(value0)) // a
let string1 = String(UnicodeScalar(value1)) // 大 in the tutorial, but errors here
let string2 = String(UnicodeScalar(value2)) // also errors here
does not compile: for string1 and string2 I get the error "No exact matches in call to initializer". Since the author posted it, I assume it must have worked in a previous version of Swift, but it doesn't in the latest one. What changed under the hood? The Strings section of Apple's handbook doesn't reveal anything.
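The only way I could get it to compile was to unwrap the result, which suggests those initializers now return an optional:

// Workaround I found: UnicodeScalar(UInt16) and UnicodeScalar(UInt32)
// seem to return an optional now, so the result has to be unwrapped
// before passing it to String(_:).
if let scalar1 = UnicodeScalar(value1) {
    print(String(scalar1)) // 大
}
if let scalar2 = UnicodeScalar(value2) {
    print(String(scalar2)) // 🍎
}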
I am trying to rewrite some TypeScript code in Swift, and in JS it is as simple as:
for (let i = str.length; i >= 1; i -= 2) {
r = String.fromCharCode(parseInt("0x" + str.substring(i - 2, i))) + r;
}
and I have been struggling with this for the past two days without success!
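For reference, this is the closest I have gotten in Swift (decodeHexPairs is just a name I made up, and I'm not sure this is correct or idiomatic):

// My rough attempt at a direct translation of the JS loop above:
// walk the hex string from the end, two hex digits (one byte) at a
// time, and prepend the decoded character, like the JS version does.
func decodeHexPairs(_ str: String) -> String {
    var r = ""
    var i = str.count
    while i >= 1 {
        let start = str.index(str.startIndex, offsetBy: max(i - 2, 0))
        let end = str.index(str.startIndex, offsetBy: i)
        if let code = UInt32(str[start..<end], radix: 16),
           let scalar = UnicodeScalar(code) {
            r = String(scalar) + r
        }
        i -= 2
    }
    return r
}

print(decodeHexPairs("61")) // a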