
I'm getting introduced to Objective-C and have a mild understanding of enum types.

Here is a piece of sample code used in the tutorial I'm following:

UIFont *bodyFont = [UIFont preferredFontForTextStyle:UIFontTextStyleBody];
UIFontDescriptor *existingDescriptor = [bodyFont fontDescriptor];
UIFontDescriptorSymbolicTraits traits = existingDescriptor.symbolicTraits;
traits |= UIFontDescriptorTraitBold;
UIFontDescriptor *newDescriptor = [existingDescriptor fontDescriptorWithSymbolicTraits:traits];
UIFont *boldBodyFont = [UIFont fontWithDescriptor:newDescriptor size:0];

From what I understand, bodyFont is set using a class method of UIFont, then existingDescriptor is created by extracting it from bodyFont. The existing UIFontDescriptorSymbolicTraits are then extracted from that and stored in traits.

I do not understand what follows after that (traits |= UIFontDescriptorTraitBold;). From googling, I understand that it is a bitwise comparison and then an assignment, but I'm not sure why it has to be done this way. On to my next question.

From the API for UIFontDescriptor.h (https://developer.apple.com/library/ios/documentation/uikit/reference/UIFontDescriptor_Class/Reference/Reference.html#//apple_ref/doc/c_ref/UIFontDescriptorSymbolicTraits)

typedef enum : uint32_t {
   /* Typeface info (lower 16 bits of UIFontDescriptorSymbolicTraits) */          
   UIFontDescriptorTraitItalic = 1u << 0,
   UIFontDescriptorTraitBold = 1u << 1,
   UIFontDescriptorTraitExpanded = 1u << 5,
   UIFontDescriptorTraitCondensed = 1u << 6,
   UIFontDescriptorTraitMonoSpace = 1u << 10,
   UIFontDescriptorTraitVertical = 1u << 11,
   UIFontDescriptorTraitUIOptimized = 1u << 12,
   UIFontDescriptorTraitTightLeading = 1u << 15,
   UIFontDescriptorTraitLooseLeading = 1u << 16,

   /* Font appearance info (upper 16 bits of UIFontDescriptorSymbolicTraits) */
   UIFontDescriptorClassMask = 0xF0000000,

   UIFontDescriptorClassUnknown = 0u << 28,
   UIFontDescriptorClassOldStyleSerifs = 1u << 28,
   UIFontDescriptorClassTransitionalSerifs = 2u << 28,
   UIFontDescriptorClassModernSerifs = 3u << 28,
   UIFontDescriptorClassClarendonSerifs = 4u << 28,
   UIFontDescriptorClassSlabSerifs = 5u << 28,
   UIFontDescriptorClassFreeformSerifs = 7u << 28,
   UIFontDescriptorClassSansSerif = 8u << 28,
   UIFontDescriptorClassOrnamentals = 9u << 28,
   UIFontDescriptorClassScripts = 10u << 28,
   UIFontDescriptorClassSymbolic = 12u << 28
} UIFontDescriptorSymbolicTraits;

What is the meaning of the notation enum : uint32_t? I know the use of enum, and I somewhat know that uint32_t means an unsigned 32-bit integer (though I'm not sure how it differs from a normal unsigned int).

Another question: why are the values created as shifted bits instead of just normal integers? And why do some values skip bits or numbers (e.g. UIFontDescriptorClassSlabSerifs at 5u << 28 jumps to UIFontDescriptorClassFreeformSerifs at 7u << 28, and UIFontDescriptorTraitBold at 1u << 1 jumps to UIFontDescriptorTraitExpanded at 1u << 5)?

Please let me know if my questions need further explanation.

geeves31
  • Googling is not the solution, studying a good "C" language book is really a necessity. – zaph Jul 17 '14 at 01:56

2 Answers


The : uint32_t specifies the size of the storage that's used for variables of this type. uint32_t means that regardless of architecture, you have exactly 32 bits of information. It's unsigned because bit twiddling on signed integers can produce unexpected results.
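
As a minimal sketch (the MyTraits enum below is made up purely for illustration), the fixed underlying type pins the storage to exactly four bytes no matter which enumerators you define:

#import <Foundation/Foundation.h>

// Hypothetical enum, only here to illustrate the ": uint32_t" syntax.
typedef enum : uint32_t {
    MyTraitItalic = 1u << 0,
    MyTraitBold   = 1u << 1,
} MyTraits;

int main(void) {
    @autoreleasepool {
        MyTraits t = MyTraitBold;
        // The storage is exactly 4 bytes on every architecture because the
        // underlying type is fixed to uint32_t.
        NSLog(@"sizeof(MyTraits) = %zu", sizeof(t)); // prints 4
    }
    return 0;
}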

The values are specified this way to make it clear that they are being used as composable flags; a value stored in a variable of this type contains many pieces of information. It's a lot easier to read and write 1u << 5 or 1u << 6 than to translate from decimal in your brain. The skipped bits are either to allow for future expansion or to group related flags, again for readability.
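
A quick sketch with the constants from the header you quoted: each trait owns exactly one bit, so OR-ing them never disturbs another flag, while the class constants in the top four bits are a small number that you read back with the mask:

UIFontDescriptorSymbolicTraits traits = UIFontDescriptorTraitBold | UIFontDescriptorTraitCondensed;
// 0x2 | 0x40 == 0x42: both flags coexist because each one owns its own bit.
NSLog(@"traits = 0x%x", (unsigned)traits);

// The class values (upper 4 bits) are not single-bit flags but small numbers
// stored in the top nibble, which is why UIFontDescriptorClassMask exists:
UIFontDescriptorSymbolicTraits fontClass = traits & UIFontDescriptorClassMask;
NSLog(@"class value = %u", (unsigned)(fontClass >> 28));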

The |= operator is not comparison, it's assignment. It's similar to +=, which adds the right-hand operand to the left-hand and stores the result back in the latter. In this case, it does a bitwise OR, which sets the bits specified on the right-hand side. This is how you add flags to a bitmask.
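
For example, a small sketch using the traits variable from the question, showing how the related bitwise operators fit in:

// Set the bold bit (this is exactly what the tutorial line does).
traits |= UIFontDescriptorTraitBold;

// Test a bit: AND with the flag and check whether anything survived.
BOOL isBold = (traits & UIFontDescriptorTraitBold) != 0;

// Clear a bit: AND with the complement of the flag.
traits &= ~UIFontDescriptorTraitItalic;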

jscs
  • Thanks for the response. I think this answer combined with the link I provided below helped clear up the picture a little more. – geeves31 Jul 17 '14 at 02:33
  • Can you explain the notation enum : uint32_t? Why not do enum uint32_t without the :, or enum unsigned int without the :? Or does this again have to do with uint32_t specifying exactly 32 bits whereas unsigned int is just >= 32 bits? – geeves31 Jul 17 '14 at 02:37
  • The reason for the choice of the colon in this syntax is not really known, although `thing:type` is used in other languages. `uint32_t` is defined to always be 32 bits wide. `unsigned int` is at least 16 bits but may be bigger depending on the platform. – jscs Jul 17 '14 at 02:42
  • Got it! Thanks for all your help. I would upvote, but I don't have enough reputation yet to do so, so I'll just mark it as solved. – geeves31 Jul 17 '14 at 03:50

traits |= UIFontDescriptorTraitBold is not a comparison; it is just bitwise OR-ing the value of UIFontDescriptorTraitBold into the value of traits. Basic "C".

enum : uint32_t gives the enumeration a fixed underlying type; it is a C++11 feature that Clang also supports in Objective-C, and it causes the enumeration to use 32-bit unsigned uint32_t values.

1u << 5 and the like are used to create values with exactly one bit set. This allows the various options to be used in combination to specify a set of options. Plain integer values could be specified, but there would be a larger chance of an error and it would be less clear. Not all bit positions need to be used; perhaps some are private, or were used earlier and later removed, or were skipped for some other arbitrary reason.
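
For instance, here is a sketch that combines two of the single-bit options in one call, using the same UIFont/UIFontDescriptor API as the question:

UIFont *body = [UIFont preferredFontForTextStyle:UIFontTextStyleBody];
// The combined value is just the OR of the two single-bit constants.
UIFontDescriptor *boldItalicDescriptor = [[body fontDescriptor]
    fontDescriptorWithSymbolicTraits:UIFontDescriptorTraitBold | UIFontDescriptorTraitItalic];
// size:0 keeps the point size carried by the descriptor.
UIFont *boldItalicBody = [UIFont fontWithDescriptor:boldItalicDescriptor size:0];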

zaph
  • `typedef unsigned int uint32_t;` These are "C" types; there is a whole set, unsigned and signed, typically found in the header file `stdint.h`. The number is the number of bits of storage. Having a good knowledge of "C" "will do you no harm" <- Richie Havens. – zaph Jul 17 '14 at 02:41