
I'm writing a DLL and want to be able to switch between the Unicode and multi-byte character set settings in MSVC++ 2010. For example, I use _T("string"), LPCTSTR, and WIN32_FIND_DATA instead of the -W and -A versions, and so on.

Now I want a string type that switches between std::string and std::wstring according to the Unicode setting. Is that possible? Otherwise, this will probably end up getting extremely complicated.

Felix Dombek
  • possible duplicate of [Is there an string equivalent to LPTSTR?](http://stackoverflow.com/questions/1824420/is-there-an-string-equivalent-to-lptstr) – Kirill V. Lyadvinsky May 26 '11 at 05:11

1 Answer


Why not do what the Win32 API does: use wide characters internally, and provide a character-converting facade of DoSomethingA functions that simply convert their input to Unicode and forward to the wide implementations.
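A minimal sketch of that facade, using a hypothetical pair DoSomethingW/DoSomethingA as the example (the names are made up for illustration); the narrow wrapper converts with MultiByteToWideChar and forwards:

#include <string>
#include <windows.h>

// The real implementation: everything internal is UTF-16.
void DoSomethingW(const wchar_t* text)
{
    // ... the actual work ...
}

// Narrow facade: convert to UTF-16, then forward.
void DoSomethingA(const char* text)
{
    // Passing -1 as the source length makes the conversion include the
    // terminating null; the first call only reports the required size.
    int len = MultiByteToWideChar(CP_ACP, 0, text, -1, NULL, 0);
    if (len == 0)
        return;                        // conversion failed
    std::wstring wide(len, L'\0');
    MultiByteToWideChar(CP_ACP, 0, text, -1, &wide[0], len);
    DoSomethingW(wide.c_str());
}

CP_ACP mirrors what the system's own -A entry points assume; substitute CP_UTF8 if the narrow input is known to be UTF-8.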

That said, you could define a tstring type like so:

#include <string>

#ifdef _UNICODE
typedef std::wstring tstring;   // Unicode build: wide (UTF-16) characters
#else
typedef std::string tstring;    // multi-byte build: narrow characters
#endif

or possibly:

typedef std::basic_string<TCHAR> tstring;   // TCHAR (from <tchar.h>) is wchar_t or char
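Either way, usage is then uniform across both builds (a small sketch using the tstring typedef above):

#include <tchar.h>
#include <windows.h>

tstring greeting = _T("hello");    // expands to L"hello" or "hello" per build
LPCTSTR p = greeting.c_str();      // c_str() yields const TCHAR*, i.e. LPCTSTR either way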
bdonlan
  • Note that you'll want to do the same for iostream types as well; e.g. instead of using `std::fstream`, make a typedef for `std::basic_fstream` and use that instead (see the stream typedef sketch after these comments). – ildjarn May 26 '11 at 01:31
  • @bdonlan: Thanks! So your first sentence means that I should always compile with Unicode enabled and just convert narrow input as soon as I receive it? I thought about that, too. One problem with that is that I have no idea how to receive a narrow BSTR when I compile with Unicode enabled. – Felix Dombek May 26 '11 at 01:33
  • @Felix: There's no such thing as a 'narrow' BSTR -- BSTR is, by definition, always UTF-16. – ildjarn May 26 '11 at 01:38
  • But ... VB6 Strings are BSTRs in C++, right? Then, why does my program work when I receive them as LPCSTR but not when I receive them as BSTR? Until now, I only compiled in Unicode mode ... – Felix Dombek May 26 '11 at 01:42
  • BSTR is always 16-bit on Windows: http://msdn.microsoft.com/en-us/library/ms221069.aspx –  May 26 '11 at 01:44
  • OK, I read that too ... I still don't get the origin of this problem here, though ... Strings I get from VB6 always come as char*, not wchar_t*. Why? – Felix Dombek May 26 '11 at 01:47
  • If you're using just a regular DLL export, you might be dealing with an ordinary string, not a BSTR - BSTRs are generally passed across COM interfaces. I haven't worked much with VB6 bindings, so I can't say for sure, but that might be the cause of your issues. In any case, `BSTR` will still be 16-bit even if you build for MBCS. – bdonlan May 26 '11 at 01:48
  • I googled this a little and it appears that VB6 does not support Unicode natively, but does know how to convert it when dealing with API calls or file I/O. You may need to look into using byte arrays with your DLL instead of a string, and passing them into controls that are BSTR-aware. –  May 26 '11 at 02:00
  • Passing BSTRs back is not a problem, it works flawlessly. I have come to the conclusion that it is too difficult to get my program to compile in both Unicode and multi-byte modes, mostly because of boost::filesystem, and I will go with bdonlan's original idea: -A and -W versions of my functions, using only Unicode internally. – Felix Dombek May 26 '11 at 02:24
  • If you declare a function in VB6 as taking a `String` argument, then VB6's UTF-16 string will be converted to an "ANSI" string during the call. Declare the argument as `ByVal … As Long`, and pass the `StrPtr`, then the pointer to the original UTF-16 data is passed. – Philipp May 28 '11 at 06:32