This program returns `0` on my machine:
```c
#include <stdbool.h>

union U {
    _Bool b;
    char c;
};

int main(void) {
    union U u;
    u.c = 3;
    _Bool b = u.b;
    if (b == true) {
        return 0;
    } else {
        return 1;
    }
}
```
AFAICT, `_Bool` is an integer type that can store at least the values 0 and 1, and `true` is the integer constant 1. On my machine, `sizeof(_Bool) == 1` and `CHAR_BIT == 8`, which means that a `_Bool` object has 256 possible representations.
I can't find much in the C standard about the trap representations of `_Bool`, nor whether creating a `_Bool` with a representation different from 0 or 1 (on implementations that support more than two representations) is OK, and if it is OK, whether those representations denote true or false.
What I can find in the standard is what happens when an integer is converted to `_Bool`: the integer converts to the 0 representation if its value is 0, and to the 1 representation otherwise. So the snippet above ends up comparing two `_Bool`s with different representations: `_Bool[3] == _Bool[1]`.
I can't find much in the C standard about what the result of such a comparison is. Since `_Bool` is an integer type, I'd expect the rules for integers to apply, so that the equality comparison returns true only if the representations are equal, which is not the case here. Since this program returns `0` on my platform, it would appear that this rule does not apply here.
Why does this code behave like this? (i.e., what am I missing? Which representations of `_Bool` are trap representations and which aren't? How many representations can denote `true` and `false`? What role do padding bits play in this? etc.)
What can portable C programs assume about the representation of `_Bool`?