There's this piece of code that works fine in Qt Creator (on Windows 7) but behaves differently when run on a Linux-based embedded platform, and I'm not sure how to begin debugging it.
Here's the code in question:
QByteArray c(1, char(0x00));
bool bOk = false;
int intVal = c.toHex().toInt(&bOk, 16);
if (bOk) {
    qDebug() << "conversion success \t" << intVal;
} else {
    qDebug() << "conversion failed \t" << intVal;
}
In Qt Creator (running on Windows) this works fine and intVal has the value 0, as expected: c.toHex() yields the string "00", which toInt() then parses in base 16.
However, when compiled and run in the embedded Linux environment, the conversion fails and bOk
is false.
Strangely, for c = 0xFF it works properly in both environments.
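To narrow this down, my plan is to split the conversion apart and log the intermediate value that toHex() produces on each platform, since the failure could be in either step. This is just a diagnostic sketch of the same code using stock Qt API; nothing here is new functionality:

QByteArray c(1, char(0x00));
QByteArray hex = c.toHex();   // should be "00" on both platforms

// Log the raw intermediate string and its length, in case
// toHex() already returns something different on the target.
qDebug() << "hex:" << hex << "length:" << hex.length();

bool bOk = false;
int intVal = hex.toInt(&bOk, 16);
qDebug() << "bOk:" << bOk << "intVal:" << intVal;

If the hex string matches on both machines, that would point at toInt() rather than toHex().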
Details
Qt Version : 4.7
Environment 1 : Windows
- OS : Windows 7
- Environment : Qt Creator
- sizeof(int) = 4 bytes
- Processor : An old machine with an Intel Pentium Core 2 Duo (unfortunately I can't remember the exact model). It's x86 architecture either way, so that would be little-endian.
Environment 2 : Linux
- OS : A custom embedded Linux (not sure which version, or how to find out)
- Environment : A cross-compiled program that is running on this target.
- sizeof(int) : 4 bytes
- Processor Endianness : ARM7, so that's BE-32 endianness (according to this answer)
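Also, since I'm partly guessing at the byte order above, here's a small standalone check (plain C++, no Qt needed) that I intend to run on both machines to confirm the actual endianness; the printed labels are just my own:

#include <cstdio>

int main()
{
    unsigned int x = 1;
    // On a little-endian machine the least significant byte comes
    // first in memory, so the first byte of x is 1; on a
    // big-endian machine it is 0.
    unsigned char first = *reinterpret_cast<unsigned char *>(&x);
    std::printf(first == 1 ? "little-endian\n" : "big-endian\n");
    return 0;
}

Qt should also expose this at compile time via Q_BYTE_ORDER (equal to Q_LITTLE_ENDIAN or Q_BIG_ENDIAN), which I'd expect to agree with the runtime check.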