I'm reading the code of an iPhone sample project (Xcode IDE, Apple LLVM compiler 4.2). In a header file of an external library (written in C) for that iPhone sample project, there are some events declared in an enumeration type:
typedef enum _Application_Events
{
EVENT_EXIT = 0x80000000,
EVENT_TOUCH,
EVENT_DRAG,
EVENT_RELEASE_TOUCH,
EVENT_ROTATE_0,
EVENT_ROTATE_90,
EVENT_ROTATE_180,
EVENT_ROTATE_270
} Application_Events;
I don't understand what kind of values are assigned to those events. Is 0x80000000 supposed to be a big positive integer (2147483648), negative zero, or a negative integer (-2147483648)?
I inspected them in the Xcode debugger, with the compiler being Apple LLVM compiler 4.2: EVENT_EXIT equals (int) -2147483648, EVENT_RELEASE_TOUCH equals (int) -2147483645, and so on.
Apparently, they're treated in two's complement representation. A related post can be found here.
But what I'm not sure about now are these:
(1) Is the underlying data type for 0x80000000 always int, or can it be something else in other situations? Does this depend on the compiler or platform?
(2) If I assign a hexadecimal value to a signed integer like this, is it always interpreted as the two's complement representation? Does this depend on the compiler or platform? A related post can be found here. Another reference can be found here.
Please share some ideas. Thank you all :D