I know this is a simple question but I'm confused. I have a fairly typical gcc warning that's usually easy to fix:
warning: comparison between signed and unsigned integer expressions
Whenever I have a hexadecimal constant with the most significant bit set, like 0x80000000L, the compiler interprets it as unsigned. For example, compiling this code with -Wextra triggers the warning (gcc 4.4.x, 4.5.x):
#include <stdio.h>

int main(void)
{
    long test = 1;
    long *p = &test;
    if (*p != 0x80000000L) printf("test");
    return 0;
}
I've explicitly given the constant an L suffix to make it a long, so why is it still treated as unsigned?