First of all, `*((unsigned *)&G[0])` causes undefined behaviour by violating the strict aliasing rule. In Standard C it is not permitted to access memory of one type by using a different type, except for a handful of special cases.
You can fix this either by disabling strict aliasing in your compiler, or by using a union or `memcpy`.
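For the union route, a minimal sketch (the `memcpy` route appears in the corrected code below); the function name is mine:

```c
/* Type punning through a union is permitted in C (unlike a pointer cast).
   Assumes unsigned and float have the same size, as noted below. */
unsigned float_bits_via_union(float f)
{
    union { float f; unsigned u; } pun = { .f = f };
    return pun.u;
}
```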
(Also, your code is relying on `unsigned` being the same size as `float`, which is not true in general.)
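If you want that size assumption checked at compile time rather than at run time, C11's `static_assert` can do it; a sketch:

```c
#include <assert.h>   /* provides static_assert in C11 and later */

static_assert(sizeof(unsigned) == sizeof(float),
              "unsigned must be the same size as float for this bit trick");
```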
But supposing you did fix those issues, your code is testing the most significant bit. In the IEEE 32-bit floating point format, that bit is the sign bit, so it will read `0` for positive numbers and `1` for negative numbers.
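In other words, reading the top bit gives you the sign, not a mantissa bit. A hypothetical helper showing the difference (assuming 32-bit `unsigned` and IEEE-754 `float`):

```c
#include <string.h>

/* Returns the sign bit: 0 for non-negative values, 1 for negative values.
   Assumes unsigned is 32 bits and float is IEEE-754 single precision. */
unsigned sign_bit(float f)
{
    unsigned u;
    memcpy(&u, &f, sizeof u);
    return u >> 31;
}
```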
The last bit of the mantissa would be the least significant bit after reinterpreting the memory as an integer.
Corrected code could look like:
```c
#include <assert.h>
#include <stdio.h>
#include <string.h>

unsigned u;
assert( sizeof u == sizeof *G );   /* check the size assumption at run time */
memcpy(&u, G, sizeof u);           /* copy the float's bytes into the integer */
printf("%u", u & 1);               /* print the least significant bit */
```
NB. I would be hesitant about assuming this bit will be "random"; if you want a random distribution of bits, there are much better options.
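For example, even the standard library's `rand` (itself not a high-quality generator) is a better-defined source of bits than a float's mantissa; a minimal sketch:

```c
#include <stdlib.h>

/* Returns one pseudo-random bit; assumes srand() has been called once at start-up. */
int random_bit(void)
{
    return rand() & 1;
}
```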