When I compile and run the following code:
#include <stdio.h>
#include <stdbool.h>

struct preferences {
    bool likesMusic : 1;
    bool hasHair : 1;
    bool hasInternet : 1;
    bool hasDinosaur : 1;
    unsigned int numberOfChildren : 4;
};

int main(void) {
    struct preferences homer;
    homer.likesMusic = true;
    homer.hasHair = false;
    homer.hasInternet = true;
    homer.hasDinosaur = false;
    homer.numberOfChildren = 3;
    printf("sizeof int: %zu bits\n", sizeof(int) * 8);
    printf("sizeof homer: %zu bits\n", sizeof(homer) * 8);
    printf("sizeof bool: %zu bits, sizeof unsigned int: %zu bits\n",
           sizeof(bool) * 8, sizeof(unsigned int) * 8);
    return 0;
}
The output is:
sizeof int: 32 bits
sizeof homer: 32 bits
sizeof bool: 8 bits, sizeof unsigned int: 32 bits
When I comment out the numberOfChildren field from preferences, the output is:
sizeof int: 32 bits
sizeof homer: 8 bits
sizeof bool: 8 bits, sizeof unsigned int: 32 bits
This does not make sense to me. From this experiment one would deduce that unsigned int is 24 bits, since removing the numberOfChildren field shrank homer from 32 bits to 8 bits. Yet the program itself reports sizeof unsigned int: 32 bits.
Any insight you can provide would be appreciated. Thanks.
My setup is Ubuntu 20.04 (64-bit) on a 16-core Intel Xeon Silver processor.