  • char: 8-bit character type
  • char16_t: 16-bit character type
  • char32_t: 32-bit character type
  • wchar_t: 16- or 32-bit character type, depending on the compiler
  • UTF-8: an encoding that represents every Unicode character as one to four 8-bit code units, and is backward compatible with ASCII
  • UTF-16: an encoding that represents every Unicode character as one or two 16-bit code units
  • UTF-32: an encoding that represents every Unicode character as a single 32-bit code unit

That is my understanding so far, but I don't fully understand it. Are UTF-8, UTF-16, and UTF-32 simply implemented as the char, char16_t, and char32_t types?

  • You should learn about UTF-8, the other encoding methods are losing favour these days. – rleir Apr 29 '18 at 02:16
  • 2
    Your should read this: https://www.joelonsoftware.com/2003/10/08/the-absolute-minimum-every-software-developer-absolutely-positively-must-know-about-unicode-and-character-sets-no-excuses/. Clearly, the answer to your question is No. (Hint: you need to understand the distinction between types and encodings.) – Stephen C Apr 29 '18 at 02:19
  • Read about [UTF-8](https://en.wikipedia.org/wiki/UTF-8), [UTF-16](https://en.wikipedia.org/wiki/UTF-16), and [UTF-32](https://en.wikipedia.org/wiki/UTF-32) – Remy Lebeau Apr 29 '18 at 06:37

0 Answers