Others have explained to you why this is undefined behaviour - and why to never do this - but if you're still curious as to what's happening here: read on.
With the right compiler, on the right platform, during the right phase of the moon, the following happens.
`sizeof(double) == 8`

- Setting `numbers.PI` to 3.14 sets 8 bytes in memory to `1f 85 eb 51 b8 1e 09 40`, and the watch window displays `numbers {B=1374389535 PI=3.1400000000000001}`.
`sizeof(int) == 4`

- So, when you set `numbers.B` to 12, only the first four bytes of that memory are changed. `numbers` is now stored as `0c 00 00 00 b8 1e 09 40` (the `0c 00 00 00` is 12 on my little-endian Intel processor), and the `b8 1e 09 40` is what was left behind of PI (a sketch for dumping these bytes yourself follows this list).
- As it happens, Intel's byte order, the IEEE 754 double layout, and the compiler's union packing have conspired so that the 12 trashes only the least significant bits of PI's mantissa (without disturbing its exponent). The watch window now shows `numbers {B=12 PI=3.1399993896484428}`.
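If you'd like to reproduce those hex dumps without a debugger, here is a minimal sketch. The union shape and member names are assumed from the question, and the byte values in the comments are what a little-endian x86-64 build typically shows, not something guaranteed by the standard:

```cpp
#include <cstdio>
#include <cstring>

// Assumed shape of the union from the question: both members share storage.
union Numbers {
    int    B;   // 4 bytes here
    double PI;  // 8 bytes here
};

// Print the raw bytes of the union, lowest address first.
static void dump(const Numbers& n, const char* label)
{
    unsigned char bytes[sizeof(Numbers)];
    std::memcpy(bytes, &n, sizeof bytes);
    std::printf("%-10s", label);
    for (unsigned char b : bytes)
        std::printf("%02x ", b);
    std::printf("\n");
}

int main()
{
    Numbers numbers;

    numbers.PI = 3.14;
    dump(numbers, "after PI:");  // e.g. 1f 85 eb 51 b8 1e 09 40

    numbers.B = 12;
    dump(numbers, "after B:");   // e.g. 0c 00 00 00 b8 1e 09 40 - only 4 bytes changed
}
```

Copying the object representation into an `unsigned char` buffer with `memcpy` is itself well-defined, so the dump doesn't depend on reading the inactive member.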
So, `numbers.PI` has been modified! It was 3.1400000000000001, and now it is 3.1399993896484428. But when you stream it to `cout`, its new value gets rounded to 3.14 for display purposes, hiding the fact that its value has changed.
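Here is a minimal end-to-end sketch of the experiment itself (same assumed union; the values in the comments are what a typical x86-64 build prints, nothing the standard promises):

```cpp
#include <iomanip>
#include <iostream>

// Assumed shape of the union from the question.
union Numbers {
    int    B;
    double PI;
};

int main()
{
    Numbers numbers;

    numbers.PI = 3.14;
    std::cout << std::setprecision(17) << numbers.PI << "\n";  // 3.1400000000000001

    numbers.B = 12;  // overwrites only the first sizeof(int) bytes of the shared storage

    // Reading PI now is undefined behaviour; a typical build simply
    // reinterprets the bytes, which is what the walkthrough above describes.
    std::cout << std::setprecision(6)  << numbers.PI << "\n";  // 3.14 - default precision hides the damage
    std::cout << std::setprecision(17) << numbers.PI << "\n";  // something like 3.13999938964843...
}
```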
But remember - accessing any member of a union other than the one most recently set is still undefined behaviour.