According to the answer to this question, std::wstring can be backed by either 16-bit or 32-bit characters, i.e. it can correspond to u16string or to u32string depending on the platform. According to the first answer to this question, one can simply convert to u16string and get a std::wstring as a result.
What I wonder is: how do I know whether I have the 16-bit or the 32-bit representation? If I want to convert UTF-8 to std::wstring, it looks like I can't use the solution given, because I don't know which representation I will get at run time. So how do I convert it properly? Or is this distinction irrelevant, and will the conversion always succeed regardless of whether I have a 16-bit or a 32-bit representation, without ever losing anything?
Can someone please clarify?
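To make the question concrete, this is roughly the dispatch I had in mind (just a sketch, not something I trust yet; it assumes std::codecvt_utf8_utf16 is the right facet when wchar_t is 16-bit and std::codecvt_utf8 when it is 32-bit, and the helper name utf8_to_wstring is mine):

#include <string>
#include <locale>
#include <codecvt>

// Sketch: choose the conversion facet based on how wide wchar_t is on this platform.
std::wstring utf8_to_wstring(const std::string& utf8)
{
    if (sizeof(wchar_t) == 2) {
        // wchar_t holds UTF-16 code units (e.g. Windows).
        std::wstring_convert<std::codecvt_utf8_utf16<wchar_t> > conv;
        return conv.from_bytes(utf8);
    } else {
        // wchar_t holds UTF-32 code points (e.g. most Linux/Mac setups).
        std::wstring_convert<std::codecvt_utf8<wchar_t> > conv;
        return conv.from_bytes(utf8);
    }
}

Is a branch like this even necessary, or is there a single conversion that works in both cases?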
EDIT:
All this comes from the fact that here on my Windows laptop (Win8.1) with MSVC 2010, converting the string "abc" fails with the following code:
std::wstring_convert<std::codecvt_utf8<wchar_t> > myconv;
std::wstring table_name = myconv.from_bytes( (const char *) tableName );
I haven't tried it on Linux/Mac yet, but the fact that it already fails on Windows is not a good sign and suggests I'm doing something wrong.
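In case it helps anyone reproduce this (or in case I get around to testing on the other platforms), here is the snippet above as a self-contained program; the literal assigned to tableName is just a placeholder for the real value, which arrives as UTF-8 bytes:

#include <iostream>
#include <string>
#include <locale>
#include <codecvt>
#include <stdexcept>

int main()
{
    // Placeholder for the real table name (a UTF-8 byte sequence in my code).
    const char* tableName = "abc";

    std::wstring_convert<std::codecvt_utf8<wchar_t> > myconv;
    try {
        std::wstring table_name = myconv.from_bytes(tableName);
        std::cout << "converted, length = " << table_name.size() << "\n";
    } catch (const std::range_error& e) {
        // from_bytes reports failure by throwing std::range_error.
        std::cout << "conversion failed: " << e.what() << "\n";
    }
    return 0;
}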