What does `*(int*)&` do that makes the bitset work?
That's a sequence of three unary operators: `*`, `(int *)`, and `&`. As used in the code presented, `*(int*)&f` first computes a pointer to `f` (of type `float *`), then the C-style cast converts that pointer to type `int *`, and finally the `*` dereferences it to get a value of type `int`. Overall, this ...
int x = *(int*)&f;
... is roughly equivalent to ...
int x = reinterpret_cast<int&>(f);
... including that both elicit undefined behavior (they violate the strict-aliasing rule). But the intention is that the representation of `f`'s value is reinterpreted as an `int`.
How does this relate to the bitset?
`std::bitset` has a constructor that accepts an unsigned integer argument and initializes the bitset from the bits of that value. There is no such constructor that accepts a `float`, and a normal value conversion from `float` to `int` does not preserve the bit pattern of the `float`; it truncates the value toward zero instead.
In C, it would be valid to use a union to perform such a conversion:
union { float f; int x; } u;
cin >> u.f;
cout << u.x << "\n";
bitset<sizeof(int) * 8> binary(u.x);
cout << binary;
However, reading a union member other than the one most recently written is well-defined in C but undefined behavior in C++, so in C++ you need another solution, such as memcpy():
float f;
int x;
cin >> f;
memcpy(&x, &f, sizeof(x));
cout << x << "\n";
bitset<sizeof(int) * 8> binary(x);
cout << binary;