
Consider this code:

NSInteger q = 2048;
BOOL boolQ = q;

NSLog(@"%hhd", boolQ);

After execution, boolQ is equal to 0. Could someone explain why this is so?

haawa

2 Answers


BOOL is probably implemented as char or uint8_t/int8_t: the "hh" length modifier tells NSLog to print "half of half of an int", i.e. a char, which is typically one byte.

Converting to char keeps only the lowest 8 bits of 2048 (= 0x800), and those bits are all zero, so you get 0.
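
To see the truncation concretely, here is a minimal, self-contained sketch (assuming BOOL is a one-byte signed char, as it is on most Apple platforms):

#import <Foundation/Foundation.h>

int main(void) {
    @autoreleasepool {
        // 2048 = 0x800: the low 8 bits are all zero,
        // so the narrowing conversion yields 0 (NO).
        NSInteger q = 2048;
        BOOL boolQ = q;
        NSLog(@"%hhd", boolQ); // prints 0

        // 2049 = 0x801: the low byte is 0x01,
        // so the same conversion happens to yield 1 (YES).
        NSInteger r = 2049;
        BOOL boolR = r;
        NSLog(@"%hhd", boolR); // prints 1
    }
    return 0;
}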

The proper way to convert any integer to a boolean value is:

NSInteger q = some_value;
BOOL b = !!q;
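
Spelled out step by step (a minimal sketch; the value is just illustrative):

NSInteger q = 2048;
// !q  compares q against 0 and yields an int: !2048 == 0
// !!q negates that once more:                 !0    == 1
BOOL b = !!q; // b == YES (1), with no byte truncation involved
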
alk
  • This makes sense, but if you debug it you will see that the BOOL value is actually NO; I added the NSLog %hhd just to print it. If you stop on a breakpoint above the NSLog line, you can inspect that boolQ is NO. – haawa May 13 '14 at 12:29
  • As per this answer `http://stackoverflow.com/a/544250/694576`, `NO` is just a `#define` for `(BOOL) 0`. @haawa – alk May 13 '14 at 12:51

Converting an integer value to a signed type too small to represent it is implementation-defined behaviour in C (C11 6.3.1.3p3; the standard even permits an implementation-defined signal to be raised), and therefore also in the part of Objective-C which deals with C-level matters. Since the behaviour is implementation-defined, the result can be whatever the implementation documents, expected value or not; on common implementations the value is simply truncated to the low-order byte, which for 2048 is 0.

As per 6.3.1.2, any scalar value can be converted to a boolean without an explicit cast, in which case it shows the expected behaviour (0 is 0, everything else is 1). This is what gives rise to the !! idiom suggested by alk: perhaps counterintuitively, you convert the value by not explicitly converting it; instead, the conversion is handled correctly by the ! operator (6.5.3.3), which compares its operand against 0 and yields an int that is either 0 or 1.
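
A short sketch of the difference between the two conversions (the variable names here are just illustrative):

NSInteger q = 2048;

BOOL viaCast = q;        // narrowing conversion: keeps only the low byte -> 0
BOOL viaNot  = !!q;      // ! tests against 0 and yields an int 0 or 1    -> 1
BOOL viaCmp  = (q != 0); // equivalent, and arguably clearer              -> 1

NSLog(@"%hhd %hhd %hhd", viaCast, viaNot, viaCmp); // prints "0 1 1"

Writing q != 0 makes the intent explicit and avoids the narrowing conversion entirely.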

Alex Celeste