According to the c-faq,

```c
typedef struct {
    char a[3];
    short int b;
    long int c;
    char d[3];
} T;
```

`sizeof(T)` should be 16 with a 32-bit GCC. The alignment is 4, so the calculation is 3+(1)+2+(2)+4+3+(1), where the numbers in brackets are padding bytes. GCC concurs.
Based on that logic, I tried another example on my own:

```c
typedef struct {
    char buf1[9];
    unsigned short s;
    double d;
    char buf2[3];
} S;
```

`sizeof(S)` should be 32 with a 32-bit GCC, because the alignment is 8: 9+2+(5)+8+3+(5), again with padding in brackets. However, GCC tells me that `sizeof(S)` is 24.
I reckoned it was some optimization, but even after compiling with the `-O0` flag, `sizeof(S)` is still 24. I was compiling with `gcc -m32 main.c`.
What is going on here?
If I understand correctly, my calculation and GCC's do not match because each compiler has its own way of laying out struct members. Is there a universal way of calculating the size of a struct? Or rather, what is the original rule for calculating it?