
I'm reading the code of an iPhone sample project (Xcode IDE, Apple LLVM compiler 4.2). In a header file of an external library (written in C) used by that iPhone sample project, there are some events declared in an enumeration type:

typedef enum _Application_Events
{
    EVENT_EXIT = 0x80000000,
    EVENT_TOUCH,
    EVENT_DRAG,
    EVENT_RELEASE_TOUCH,
    EVENT_ROTATE_0,
    EVENT_ROTATE_90,
    EVENT_ROTATE_180,
    EVENT_ROTATE_270
} Application_Events;

I don't understand what kind of values are assigned to those events. Is 0x80000000 supposed to be a big positive integer (2147483648), or negative zero, or a negative integer (-2147483648)?

I inspected it in the Xcode debugger (compiler: Apple LLVM compiler 4.2): EVENT_EXIT equals (int) -2147483648, EVENT_RELEASE_TOUCH equals (int) -2147483645, and so on.

Apparently, they're treated in two's complement representation. A related post can be found here.
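For instance, printing the constant through both a signed and an unsigned format specifier (a quick check, assuming the enum above is in scope) shows the same bit pattern read two ways:

#include <stdio.h>
/* Assumes the Application_Events enum above is visible here. */

int main(void)
{
    printf("%d\n", (int)EVENT_EXIT);      /* -2147483648: signed view   */
    printf("%u\n", (unsigned)EVENT_EXIT); /*  2147483648: unsigned view */
    return 0;
}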

But what I'm not sure about now are these:

(1) Is the underlying data type for 0x80000000 always int, or can it be something else in other situations? Does this depend on the compiler or platform?

(2) If I assign a hexadecimal value to a signed integer like this, is it always interpreted as a two's complement representation? Does this depend on the compiler or platform? A related post can be found here. Another reference can be found here.

Please share some ideas. Thank you all :D

  • To answer the follow-on questions: (1) It's platform-dependent, per the other answer that refers to the C spec, but it's usually an int. (2) The compiler will assume the number is a signed int unless you specify a type by a cast or a suffix. Hex values are neither positive nor negative by themselves; they are bit patterns. If the high-order bit is set and you assign the value into a signed type, it will be considered negative, but if you assign it into an unsigned type it will just have the natural value you expect. – PaulProgrammer Apr 23 '13 at 19:16

4 Answers


Like many things in C-like languages, an enumeration is just an integer. Setting the first value like this causes the compiler to increment from there, guaranteeing that all of the enumeration values are less than 0 when read as signed integers (in two's complement, a set high bit indicates a negative number).

Likely, the programmers chose this starting value so that these event codes could be sent alongside other kinds of values without colliding with them.

In a nutshell, don't worry about the actual value; it's just a number. Use the name, and understand that the name is the meaning in the context of the calls that use or return these codes.
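For example, a handler can dispatch purely on the names; the numeric values never need to appear (a minimal sketch with a made-up handle_event function, not the actual library API):

#include <stdio.h>

typedef enum _Application_Events {
    EVENT_EXIT = 0x80000000,
    EVENT_TOUCH,
    EVENT_DRAG
} Application_Events;

/* Dispatch by name; the underlying numeric values are irrelevant here. */
static void handle_event(Application_Events ev)
{
    switch (ev) {
    case EVENT_EXIT:  puts("exit");  break;
    case EVENT_TOUCH: puts("touch"); break;
    case EVENT_DRAG:  puts("drag");  break;
    default:          puts("other"); break;
    }
}

int main(void)
{
    handle_event(EVENT_TOUCH); /* prints "touch" */
    return 0;
}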

PaulProgrammer
  • Thank you PaulProgrammer! I agree with you that the original programmers of the above code snippet were trying to avoid collisions with other flags, which are small positive integers such as `0x00`, `0x01`, etc. – George Apr 23 '13 at 12:59

The base type of the enumeration is implementation-defined. In this case, the base type should be unsigned int, because the standard requires the compiler to pick a base type that is wide enough to hold all enumeration values. From the C99 standard, section 6.7.2.2.4:

Each enumerated type shall be compatible with char, a signed integer type, or an unsigned integer type. The choice of type is implementation-defined,108) but shall be capable of representing the values of all the members of the enumeration. The enumerated type is incomplete until after the } that terminates the list of enumerator declarations.

108) An implementation may delay the choice of which integer type until all enumeration constants have been seen.
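One way to probe which base type the implementation actually picked (a sketch; per the passage above, the answer is implementation-defined):

#include <stdio.h>

typedef enum _Application_Events {
    EVENT_EXIT = 0x80000000,
    EVENT_TOUCH
} Application_Events;

int main(void)
{
    /* Width of the implementation-chosen base type. */
    printf("sizeof = %zu\n", sizeof(Application_Events));

    /* If the base type is unsigned, -1 converted to it wraps to a large
       positive value; if it is signed, the comparison stays false. */
    printf("base type looks %s\n",
           (Application_Events)-1 > 0 ? "unsigned" : "signed");
    return 0;
}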

Sergey Kalinichenko
  • C99 standard (can be found [here](http://cs.nyu.edu/courses/spring13/CSCI-GA.2110-001/downloads/C99.pdf)), section 6.7.2.2.4 is at page 105. – George Apr 23 '13 at 07:58

The underlying type of the enum depends on the values it needs to hold. The compiler has some latitude in how that type is ultimately defined. In your case, the underlying type of Application_Events is likely unsigned int, because 0x80000000 is greater than INT_MAX, assuming that int is 32 bits in size (which is generally what an enum is). But for something like:

enum foo_t {
   FOO_Start,
   FOO_Thing,
   FOO_Another_Thing,
   FOO_End
};

The type of enum foo_t could be int or unsigned int.

However, enumeration constants (e.g., EVENT_EXIT, FOO_Start, etc.) are of type int. That's what you're seeing in the debugger. If you do something like

Application_Events foo = EVENT_EXIT;

the type of foo could be unsigned. This question has changed a little, I think, so:

1) For iPhone, the constant 0x80000000 is probably unsigned (the iPhone's ARM processor has 32-bit ints). Its value depends on the platform and the version of C used.

2) As a practical reality, you can assume that your processor will support two's complement arithmetic, since most platforms use it. The C language itself does not guarantee that, however. Other arithmetic schemes (ones' complement, signed magnitude) are allowed.
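A C11 _Generic sketch can make the constant-versus-variable distinction visible; what each line prints is up to the implementation, which is exactly the point:

#include <stdio.h>

typedef enum _Application_Events {
    EVENT_EXIT = 0x80000000,
    EVENT_TOUCH
} Application_Events;

#define TYPE_NAME(x) _Generic((x),            \
    int:          "int",                      \
    unsigned int: "unsigned int",             \
    default:      "something else")

int main(void)
{
    Application_Events foo = EVENT_EXIT;

    /* The constant and a variable of the enum type need not agree. */
    printf("EVENT_EXIT : %s\n", TYPE_NAME(EVENT_EXIT));
    printf("foo        : %s\n", TYPE_NAME(foo));
    return 0;
}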

ldav1s
  • Thank you ldav1s! But in my case `Application_Events` is just defined together with the above `enum` type, and the debugger in Xcode tells me that it's of type `int`. – George Apr 23 '13 at 12:47
  1. The type for an enum is int.
  2. 0x80000000 is just a number, but in hexadecimal notation. Its value is whatever you confirmed in the debugger (or in any hexadecimal-to-decimal converter).
  3. The way enums work is that the values get assigned incrementally from any explicitly assigned value. So, in this case, the enums are getting assigned as EVENT_EXIT=0x80000000, EVENT_TOUCH=0x80000001, EVENT_DRAG=0x80000002, and so on, as the sketch below verifies.
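A quick compile-time check of that increment rule (a sketch using C11 static_assert):

#include <assert.h>

typedef enum _Application_Events {
    EVENT_EXIT = 0x80000000,
    EVENT_TOUCH,
    EVENT_DRAG
} Application_Events;

/* Each enumerator without an explicit value is the previous one plus one. */
static_assert(EVENT_TOUCH == 0x80000001, "EVENT_TOUCH follows EVENT_EXIT");
static_assert(EVENT_DRAG  == 0x80000002, "EVENT_DRAG follows EVENT_TOUCH");

int main(void) { return 0; }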
Ziffusion
  • Thank you Ziffusion! Is there any reference to support your "the type for an `enum` is `int`"? – George Apr 23 '13 at 12:36