
I'm trying to use the Unicode character 👐 ('\uD83D\uDC50') in C++, but the compiler reports the following error: "invalid universal character".

  • Please post your code! – Programmer Oct 22 '16 at 09:19
  • That looks like a UTF-16 surrogate pair. You might try `\u0001F450` instead. Not sure if that will work with any compiler though, and even if it did you'd still have the problem of *presenting* the emoji correctly. – Cheers and hth. - Alf Oct 22 '16 at 11:41
  • It worked by writing it like this: "\U0001F450"; this way it can support the universal character set in C++. Thanks for the help – Bali Oct 24 '16 at 12:13
  • @Cheersandhth.-Alf Check this [Rules for C++ string literals escape character](https://stackoverflow.com/questions/10220401/rules-for-c-string-literals-escape-character). Wow, it's new to me, and I wonder why *C++ Primer* says nothing about this. Btw, for non-BMP code points, `\U` (capital U) should be used (see the sketch after these comments). – Rick Apr 23 '20 at 18:16
  • It's interesting that when copying a non-BMP Unicode character (e.g. this [one](https://unicode-table.com/en/1D11E/)) from Chrome to CLion, it is pasted as a surrogate pair, while copying that surrogate pair from CLion back to Chrome shows the Unicode character immediately. – Rick Apr 23 '20 at 18:20
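For reference, a minimal sketch of what the comments converge on. The surrogate-pair spelling is rejected because surrogate code points (U+D800..U+DFFF) are not valid universal character names in C++, while spelling the code point itself with `\U` and eight hex digits compiles. The variable name `open_hands` is illustrative, and actually seeing the emoji assumes a UTF-8 console with a suitable font:

```cpp
#include <cstdio>

int main() {
    // const char* pair = "\uD83D\uDC50";  // ill-formed: D83D and DC50 are
    //                                     // surrogate code points, not valid
    //                                     // universal character names
    // U+1F450 (OPEN HANDS SIGN) spelled with \U and eight hex digits instead:
    auto open_hands = u8"\U0001F450";      // UTF-8 encoded (char8_t[] in C++20)
    std::printf("%s\n", reinterpret_cast<const char*>(open_hands));
}
```

Even when the literal compiles, the terminal's encoding and font decide whether the emoji is rendered or replaced with placeholder boxes, which is the separate presentation problem mentioned above.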

0 Answers