
I was looking through the NSString header file to see how Apple writes their enumerations and came across this piece of code:

enum {
    NSStringEncodingConversionAllowLossy = 1,
    NSStringEncodingConversionExternalRepresentation = 2
};
typedef NSUInteger NSStringEncodingConversionOptions;

This leaves me with a couple of questions.

  1. Why have they used anonymous enumerations? Is this approach advantageous?
  2. Is the typedef NSUInteger NSStringEncodingConversionOptions; line a good idea to include normally, or is it only used here because they have declared an anonymous enumeration?
BoltClock
FreeAsInBeer

1 Answer


That strange-looking definition is there to pin down the bit-width and the signedness of the enum type in both 64-bit and 32-bit environments. It's detailed in this Apple document, but let me summarize it here.

In the past, Apple used standard typedef'd enums, as in

typedef enum { .... } NSEnumTypeName;

before 64-bit/32-bit universal binaries were (re)introduced. (I say "re" because fat binaries have been around since the NeXTSTEP days. Anyway.)

However, this leaves the bit-width and the signedness of the typedef'd type NSEnumTypeName implementation-defined, as specified in the C standard (see §6.7.2.2.4).

That makes it trickier to write code that compiles consistently across different compilers and bit-widths.
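For instance, here is a minimal sketch (the ExampleEnum type and the program around it are made up purely for illustration; the exact sizes depend on your compiler and target) showing that a plain enum's storage is whatever the implementation picks:

#include <stdio.h>

/* Old-style typedef'd enum: the compiler chooses the underlying type. */
typedef enum { ExampleA, ExampleB } ExampleEnum;

int main(void) {
    /* The standard only requires the type to be able to hold all the
       enumerator values, so its size (and signedness) is
       implementation-defined -- many compilers print 4 here, but you
       can't rely on it across compilers and architectures. */
    printf("sizeof(ExampleEnum) = %zu\n", sizeof(ExampleEnum));
    return 0;
}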

So Apple switched from the standard enum to an anonymous enum paired with a typedef to a specific signed or unsigned integer type: the anonymous enum only supplies the constant values, while the typedef is what names the type and fixes its width.
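Applied to your own code, the same pattern looks roughly like this (the MYWidgetOptions type and its constants are hypothetical, just to illustrate the declaration style):

#import <Foundation/Foundation.h>

/* The constants live in an anonymous enum; the option type itself is a
   typedef to NSUInteger, so its width and signedness follow NSUInteger
   on every architecture. */
enum {
    MYWidgetOptionFrobulate  = 1 << 0,
    MYWidgetOptionReticulate = 1 << 1
};
typedef NSUInteger MYWidgetOptions;

int main(void) {
    @autoreleasepool {
        MYWidgetOptions opts = MYWidgetOptionFrobulate | MYWidgetOptionReticulate;
        /* sizeof(opts) is sizeof(NSUInteger): 4 bytes in a 32-bit slice,
           8 bytes in a 64-bit slice -- and the same in every translation
           unit, regardless of which constants the enum happens to hold. */
        NSLog(@"options = %lu, size = %zu", (unsigned long)opts, sizeof(opts));
    }
    return 0;
}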

Yuji