I know violating the strict-aliasing rule is Undefined Behavior as per the C standard. Please don't tell me it is UB and there is nothing to talk about.
I'd like to know if there are compilers that won't produce the behavior I expect (defined below) for the following code.
Assume the size of `float` and `int` is 4 bytes, and a big-endian machine.

```c
float f = 1234.567; /* Any value here */
unsigned int u = *(unsigned int *)&f;
```
My expected behavior in English words is "get the four bytes where the `float` is stored and put them in an `unsigned int` as is". In code it would be this (I think there is no UB here):
```c
float f = 1234.567; /* Any value here */
unsigned char *p = (unsigned char *)&f;
/* Cast each byte to unsigned int before shifting: a bare p[0] << 24
   would promote p[0] to (signed) int, and the shift overflows when
   p[0] >= 128, which is itself UB. */
unsigned int u = ((unsigned int)p[0] << 24) | ((unsigned int)p[1] << 16)
               | ((unsigned int)p[2] << 8)  |  (unsigned int)p[3];
```
I'd also welcome practical and concrete examples of why, apart from being UB per the standard, a compiler would produce what I consider unexpected behavior.