
how can I convert QString to LPCSTR ?

How do I do it when #ifdef UNICODE is defined and when it isn't ?

Thanks very much :)

4 Answers


I guess:

QString str("ddddd");
// Keep the std::string alive: calling c_str() directly on the temporary
// returned by toStdString() would leave lstr dangling.
std::string s = str.toStdString();
LPCSTR lstr = s.c_str();
snoofkin

QString can always hold Unicode; LPCSTR is never Unicode. This means that you do have to consider what to do with the characters that won't fit. This isn't a "which method to use" question, but a design question.

It's quite possible that in your specific case, you absolutely know that the QString only contains characters from your local "ANSI" codepage (also known as ACP). In that case, the correct function is QString::toLocal8Bit().

Alternatively, you might know that the QString only contains characters from Latin1 (ISO 8859-1). In that case, the correct function is QString::toLatin1().

You could try calling QString::toUtf8(). This will always produce a valid byte array, even if the QString contains arbitrary Unicode characters. However, formally you can't point an LPCSTR at it: UTF-8 is not a valid ACP codepage. And presumably you want this LPCSTR to pass to another function outside your control. It's likely that function won't expect UTF-8. If it expected Unicode at all, it would take an LPCWSTR.

MSalters

I found the following solution from here and it works flawlessly for me:

void fooSub(LPCTSTR x); // this is our function :-)

void foo()
{
    QString text;
    if (sizeof(TCHAR) == 1)
        fooSub((LPCSTR)text.toLocal8Bit().constData()); // check which conversion fits your data; toUtf8() is another option
    else
        fooSub((LPCWSTR)text.utf16());
}
zeFree
LPCSTR == const char *

It's not Unicode, so you need an 8-bit conversion (the original cast (const char *)qtString does not compile for a QString):

QByteArray ba = qtString.toLocal8Bit();
LPCSTR s = ba.constData();

Mark.Ablov