
My current understanding of the difference between std::string and std::wstring is simply the buffer's type; namely, char vs wchar_t, respectively.

I've also read that most (if not all) Linux distros use char for all strings, both ASCII and UTF-8, while Windows is the primary OS that still uses wchar_t.

However, there are a few more string types that I want to get straight in my head: u16string and u32string, which are strings whose elements are 2-byte and 4-byte code units, respectively.

So, my question is this:

On platforms where sizeof(wchar_t) == 2, is std::wstring functionally equivalent to std::u16string? And likewise, on platforms where sizeof(wchar_t) == 4, is it equivalent to std::u32string?

Qix - MONICA WAS MISTREATED

1 Answer


The difference is that the size and encoding of char and wchar_t are implementation-defined, while char16_t and char32_t are explicitly defined by the C++11 standard to hold UTF-16 and UTF-32 code units, respectively.

This means that wstring is likely to store the same data as either u16string or u32string, but we don't know which one. And it is allowed for some odd implementation to make them all different, as the size and encoding of the old char types are just not defined by the standard.

Nicol Bolas
Bo Persson
  • `wstring` will *never* be the same as those other string types. The standard *requires* `wchar_t` to be a distinct type from `char16_t` and `char32_t`. They may in fact have the same underlying type and the same encoding, but `is_same` will *never* be `true` for any valid C++ implementation. – Nicol Bolas May 28 '16 at 20:52
  • "The same" here means "functionally equivalent", like it says in the question. If `sizeof(wchar_t) == 4` you will very likely get the same result from using either a `wstring` or a `u32string`. – Bo Persson May 29 '16 at 10:02