You're streaming `char`s. These get automatically ASCII-ised for you by IOStreams*, so you're seeing (or rather, not seeing) unprintable characters (in fact, all 0x01 bytes).
You can cast to `int` to see the numerical value, and perhaps add `std::hex` for a conventional view.
Example:
#include <iostream>
#include <iomanip>

int main()
{
    int a = 0;

    // Alias the first four bytes of `a` using `char*`
    char* x1 = (char*)&a;
    char* x2 = x1 + 1;
    char* x3 = x1 + 2;
    char* x4 = x1 + 3;

    *x1 = 1;
    *x2 = 1;
    *x3 = 1;
    *x4 = 1;

    std::cout << std::hex << std::setfill('0');
    std::cout << '@' << "0x" << std::setw(2) << (int)*x1
              << ' ' << "0x" << std::setw(2) << (int)*x2
              << ' ' << "0x" << std::setw(2) << (int)*x3
              << ' ' << "0x" << std::setw(2) << (int)*x4
              << '@' << '\n';

    std::cout << "0x" << a << '\n';
}
// Output:
// @0x01 0x01 0x01 0x01@
// 0x1010101
Those saying that your program has undefined behaviour are incorrect (assuming your `int` has at least four bytes in it); aliasing objects via `char*` is specifically permitted.
The `16843009` output is correct; that's equal to `0x01010101`, which you'd again see if you put your stream into hex mode.
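(Spelled out: `0x01010101` is 2^24 + 2^16 + 2^8 + 2^0 = 16777216 + 65536 + 256 + 1 = 16843009.)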
N.B. Some people will recommend `reinterpret_cast<char*>(&a)` and `static_cast<int>(*x1)`, instead of C-style casts, though personally I find them ugly and unnecessary in this particular case. For the output you can at least write `+*x1` to get a "free" promotion to `int` (via the unary `+` operator), but that's not terribly self-documenting.
* Technically it's something like the opposite; IOStreams usually automatically converts your numbers and booleans and things into the right ASCII characters to appear correct on screen. For `char` it skips that step, assuming that you're already providing the ASCII value you want.
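A quick way to see the difference (the exact glyph printed for 65 assumes an ASCII-compatible execution character set):

#include <iostream>

int main()
{
    char c = 65;
    int  i = 65;

    std::cout << c << '\n';  // prints the character itself: A
    std::cout << i << '\n';  // prints the number: 65
}
// Output:
// A
// 65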