
I have been told that BOOL in Objective-C is a typedef of unsigned char, and that the YES and NO keywords are encoded chars. This is not the first time I have heard this. I have read that it is because Apple used BOOL before the C standard provided a _Bool type; am I wrong? Is there any advantage to that? Are we wasting bits of memory? Does this provide a way to return valuable data in a function? Would it be correct to use it as a way to return different values when some unexpected behavior occurs?

BOOL myFunction(int argument)
{
    BOOL result = YES; // the function computes its result here
    if (someError == YES) { // someError: some error flag set while computing
        return 5; // signal the problem with a value that is neither YES nor NO
    }
    return result;
}
Lluís

3 Answers

5

Are we wasting bits of memory?

No, because you can't get a variable smaller than a char: it's always a single byte. You can pack multiple bits representing boolean flags in a single word, but you have to do it manually, with bit shifts, bit fields, and so on.
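
For illustration, manual packing might look something like this (the type, enum, and function names are made up for the sketch, not from any framework):

#import <Foundation/Foundation.h>

// Bit fields: three one-bit flags share a single byte.
typedef struct {
    unsigned char isDirty   : 1;
    unsigned char isVisible : 1;
    unsigned char isLocked  : 1;
} PackedFlags;

// The same idea with masks and shifts on a plain integer.
enum {
    FlagDirty   = 1 << 0,
    FlagVisible = 1 << 1,
    FlagLocked  = 1 << 2
};

BOOL isFlagLocked(unsigned char flags)
{
    return (flags & FlagLocked) ? YES : NO; // normalize the masked bit to YES/NO
}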

Does this provide a way to return valuable data in a function?

Not really: what you did is a hack, although 5 would definitely make its way through the system to the caller, and would be interpreted as YES in a "plain" if statement, e.g.

if (myFunction(123)) {
    ...
}

However, it would fail miserably if used like this:

if (myFunction(123) == YES) { // 5 != YES
    ...
}

Would it be correct to use it as a way to return different values when some unexpected behavior occurs?

It would always be incorrect from the readability point of view; as far as "doing what you meant it to do", your mileage may vary, depending on the way in which your function is used.
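
If a function genuinely needs to report more than success or failure, a more readable option is a dedicated result type instead of an overloaded BOOL; a minimal sketch with made-up names:

#import <Foundation/Foundation.h>

// Illustrative only: the caller sees the unexpected case by name, not as a magic 5.
typedef enum {
    MyResultOK,
    MyResultFailed,
    MyResultUnexpectedInput
} MyResult;

MyResult myCheckedFunction(int argument)
{
    if (argument < 0) {
        return MyResultUnexpectedInput;
    }
    return MyResultOK;
}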

Sergey Kalinichenko
  • Might as well just return an int and be done with it. Thank you for highlighting the readability aspect. It's a pain to maintain code from someone who has been 'clever' – Bergasms Jan 14 '13 at 22:03
  • The way around the problem with `if (myFunction(123) == YES)` is `if (!!myFunction(123) == YES)`. You will occasionally see the double negation used to convert a non-zero number to a `BOOL` `YES` value. – zaph Jan 14 '13 at 22:27
  • @Zaph Yes - that's the way really defensive programmers shield themselves against hacks of people who get fancy :) – Sergey Kalinichenko Jan 14 '13 at 22:28
1

There is a slight advantage: On many platforms (including iOS, IIRC), sizeof(_Bool) == sizeof(int), so using char can be slightly more compact.

Except that BOOL is actually signed char, not char; this is so that @encode(BOOL) evaluates to the same thing on all platforms. This complicates bit fields slightly, since BOOL foo:1; appears to define a 1-bit signed integer (IIRC the behaviour of which is undefined); clearly unsigned char would have been a better choice, but it was probably too late.
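
A small sketch of that bit-field pitfall (the struct and function names are made up; the exact behaviour is compiler-dependent):

#import <Foundation/Foundation.h>

struct Flags {
    BOOL ready : 1; // a 1-bit *signed* field: its only representable values are 0 and -1
};

void bitFieldDemo(void)
{
    struct Flags f;
    f.ready = YES;          // 1 does not fit in a single signed bit
    if (f.ready == YES) {   // typically false: the field reads back as -1 on common compilers
        // not reached in practice
    }
}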

_Bool also ought to optimize better since the compiler can make assumptions about the bit-pattern used, e.g. replacing a&&b with a&b (provided b is side effect-free). Some architectures also represent "true" as all bits set, which is useful for masking (SSE comparison instructions come to mind).
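
A hedged sketch of the optimization argument (illustrative functions only):

#include <stdbool.h>

// With _Bool the compiler knows a and b are exactly 0 or 1, so the
// short-circuit && can be lowered to a branch-free bitwise &.
bool bothTrue(bool a, bool b)
{
    return a && b; // may compile down to (a & b)
}

// With a char-sized BOOL that rewrite would be wrong in general:
// 2 && 1 is true, but 2 & 1 == 0.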

tc.
0

"BOOL in Objective-C" is not an unsigned char, it's whatever the Objective-C library defines it to be. Which is unsigned char or bool, depending on your compiler settings (32 bit or 64 bit). Both behave different. Try this code with a 32 bit compiler and a 64 bit compiler:

BOOL b = 256; // with a signed char BOOL, 256 truncates to 0; with a real bool it stays true
if (b) NSLog (@"b is true"); else NSLog (@"b is false");
gnasher729