This would only work on platforms where `sizeof(int) > 1`. As an example, we'll assume it's 2, and that a `char` is 8 bits.
Basically, with little-endian, the number 1 as a 16-bit integer looks like this:
00000001 00000000
But with big-endian, it's:
00000000 00000001
So first the code sets `a = 1`, and then this:

`*( (char*)&a ) == 1`

takes the address of `a`, treats it as a pointer to a `char`, and dereferences it. So:
- If `a` contains a little-endian integer, you're going to get the `00000001` byte, which is 1 when interpreted as a `char`.
- If `a` contains a big-endian integer, you're going to get `00000000` instead. The check for `== 1` will fail, and the code will conclude the platform is big-endian.
You could improve this code by using `int16_t` and `int8_t` instead of `int` and `char`. Or better yet, just check if `htons(1) != 1`.