Using the following macro:
#define MIN_SWORD (signed int) 0x8000
used in, for example, the following expression:
signed long s32;
if (s32 < (signed long)MIN_SWORD)...
is expected to do the following check:
if (s32 < -32768)
On some compilers it seems to work fine, but on others the expression is evaluated as:
if (s32 < 32768)
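For completeness, here is a minimal, self-contained test program that makes the difference visible (the test value `0L` and the printed messages are my own additions; any value strictly between -32768 and 32768 would distinguish the two behaviors):

```c
#include <stdio.h>

#define MIN_SWORD (signed int) 0x8000

int main(void)
{
    signed long s32 = 0L;  /* test value between -32768 and 32768 */

    /* With the expected check (s32 < -32768) this branch is not taken;
       if the constant instead evaluates to 32768L, it is taken. */
    if (s32 < (signed long)MIN_SWORD)
        printf("taken:     MIN_SWORD evaluated to %ld\n",
               (signed long)MIN_SWORD);
    else
        printf("not taken: MIN_SWORD evaluated to %ld\n",
               (signed long)MIN_SWORD);

    return 0;
}
```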
My question: how is an ANSI-C compiler supposed to evaluate the following expression:
(signed long) (signed int) 0x8000
?
It seems that on some compilers the cast to `(signed int)` does not cause the expected conversion of the positive constant 0x8000 to the minimum negative value of a signed int when the result is afterwards cast to the wider type signed long. In other words, the evaluated constant is not equivalent to -32768L, but to 32768L.
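A small probe along these lines (my own sketch, using nothing beyond `<stdio.h>` and `<limits.h>`) prints what a given compiler actually produces for the disputed constant, together with the range of signed int it is working with:

```c
#include <stdio.h>
#include <limits.h>

int main(void)
{
    /* Range of signed int on this implementation. */
    printf("INT_MIN = %d, INT_MAX = %d\n", INT_MIN, INT_MAX);

    /* The disputed constant, evaluated exactly as in the macro. */
    printf("(signed long)(signed int)0x8000 = %ld\n",
           (signed long)(signed int)0x8000);

    return 0;
}
```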
Is this behavior perhaps left undefined by ANSI C?