The C standard uses the word "byte" in many places. Mostly it means something very close to my understanding of the word: an 8-bit chunk of data.
But:
The sizeof operator yields the size (in bytes) of its operand
And:
When sizeof is applied to an operand that has type char, unsigned char, or signed char, (or a qualified version thereof) the result is 1
Later:
When applied to an operand that has array type, the result is the total number of bytes in the array.
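To make this concrete, here is a small probe I would run (a minimal sketch; CHAR_BIT from <limits.h> reports the number of bits in a byte on the implementation at hand):

#include <limits.h>
#include <stdio.h>

int main(void)
{
    /* CHAR_BIT is the number of bits in a byte; the standard
       requires it to be at least 8, but it may be larger */
    printf("bits per byte: %d\n", CHAR_BIT);

    /* sizeof(char) is 1 by the quoted rule, whatever CHAR_BIT is */
    printf("sizeof(char) : %zu\n", sizeof(char));
}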
So if we consider a machine where char is wider than 8 bits, would the observable behavior of this program differ from that on a machine with an 8-bit char?
#include <stddef.h> /* for size_t */

int main(void)
{
    char foo[5];
    for (size_t index = 0; index < sizeof(foo) / sizeof(char); index++)
    {
        /* some code */
    }
}
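For reference, this is how I would check the loop count on a given machine (a sketch, assuming a hosted implementation where printf is available):

#include <stdio.h>

int main(void)
{
    char foo[5];
    size_t count = 0;

    /* if I read the quotes above correctly, sizeof counts bytes and
       a char is one byte by definition, so this should print 5 on
       both kinds of machines -- is that right? */
    for (size_t index = 0; index < sizeof(foo) / sizeof(char); index++)
        count++;

    printf("sizeof(foo) = %zu, iterations = %zu\n", sizeof(foo), count);
}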
So maybe "byte" means something different in the C standard's understanding. Could anyone explain: is a byte always 8 bits, or is it something different?
And one extra question: is

sizeof(char) == sizeof(array[0])

always true for a char array, considering the byte-size differences?
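Here is how I would try to verify that at compile time (a sketch, assuming a C11 compiler where _Static_assert is available):

char array[5];

/* fails to compile if the two sizes ever disagree */
_Static_assert(sizeof(char) == sizeof(array[0]),
               "sizeof(char) and sizeof(array[0]) disagree");

int main(void) { return 0; }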