I thought I understood Unicode scalars in Swift pretty well, but the dog face emoji proved me wrong.
for code in "🐶".utf16 {
    print(code)
}
The UTF-16 codes are 55357 and 56374. In hex, that's d83d and dc36.
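(To see those same values in hex directly, something like this works:)

for code in "🐶".utf16 {
    print(String(code, radix: 16))  // prints d83d, then dc36
}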
Now:
let dog = "\u{d83d}\u{dc36}"
Instead of getting a string with "🐶", I'm getting an error:
Invalid unicode scalar
I tried with the UTF-8 codes and that didn't work either. It doesn't throw an error, but it returns "ð¶" instead of the dog face.
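For completeness, the UTF-8 attempt was something like this, writing each of the four UTF-8 bytes f0 9f 90 b6 as its own escape:

let dogUTF8 = "\u{f0}\u{9f}\u{90}\u{b6}"
print(dogUTF8)  // prints "ð¶" (plus two invisible characters), not the dog face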
What is wrong here?