Consider this code:

NSInteger q = 2048;
BOOL boolQ = q;
NSLog(@"%hhd", boolQ);
After execution, boolQ is equal to 0. Could someone explain why this is so?
BOOL is probably implemented as char, or as uint8_t/int8_t, since the "hh" length modifier prints half of the half of an integer, which is typically one byte. Converting 2048 (= 0x800) to a char keeps only the lowest 8 bits, which are all zero, so you get 0.
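To make the truncation visible, here is a minimal sketch, assuming (as above) a platform where BOOL is a char-sized type; the contrasting value 2049 is my addition, chosen because its low byte is 1:

#import <Foundation/Foundation.h>

int main(void) {
    NSInteger a = 2048;   // 0x800 — low byte is 0x00
    NSInteger b = 2049;   // 0x801 — low byte is 0x01
    BOOL boolA = a;       // truncated to the lowest 8 bits -> 0 (NO)
    BOOL boolB = b;       // truncated to the lowest 8 bits -> 1 (YES)
    NSLog(@"%hhd %hhd", boolA, boolB);  // logs: 0 1
    return 0;
}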
The proper way to convert any integer to a boolean value is:
NSInteger q = someValue; // any integer value
BOOL b = !!q;
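Applied to the snippet from the question, the double negation produces the expected result:

NSInteger q = 2048;
BOOL boolQ = !!q;       // !q is 0 (q is nonzero), !!q is 1
NSLog(@"%hhd", boolQ);  // logs: 1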
Converting an integer value to a signed type too small to represent it is implementation-defined behaviour in C (C11 standard, 6.3.1.3; see also Annex J.3), and therefore also in the part of Objective-C which deals with C-level matters. Since the standard leaves the result to the implementation, the compiler may produce whatever its documentation says, expected value or not; in practice, it keeps the low-order byte.
As per 6.3.1.2, when any scalar value is converted to a boolean, the result is 0 if the value compares equal to 0 and 1 otherwise, giving rise to the !! idiom suggested by alk. Perhaps counterintuitively, you convert the value by not explicitly converting it: the conversion is handled correctly by the implicit conversion the ! operator performs, which always yields an int of 0 or 1.
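To make those implicit steps visible, here is a short sketch; the q != 0 spelling is an equivalent alternative added for contrast, not something from the answers above:

NSInteger q = 2048;
int once  = !q;              // logical NOT yields an int: 0 here, since q is nonzero
int twice = !once;           // negating again gives 1 — small enough to survive the BOOL conversion
BOOL viaCompare = (q != 0);  // the comparison also yields an int 0 or 1; same effect, spelled out
NSLog(@"%d %d %hhd", once, twice, viaCompare);  // logs: 0 1 1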