First, you have to understand that the x86 architecture is what is called little-endian. This means that in multibyte variables, the bytes are ordered in memory from least to most significant. If you don't understand what that means, it'll become clear in a second.
A `char` is 8 bits -- one byte. When you store `'A'` into one, it gets the value `0x41` and is happy. An `int` is larger; on many architectures it is 32 bits -- 4 bytes. When you assign the value `'A'` to an `int`, it gets the value `0x00000041`. This is numerically exactly the same, but there are three extra bytes of zeros in the `int`.
So your `int` contains `0x00000041`. In memory, that is arranged in bytes, and because you're on a little-endian architecture, those bytes are arranged from least to most significant -- the opposite of how we normally write them! The memory actually looks like this:
```
      +----+----+----+----+
int:  | 41 | 00 | 00 | 00 |
      +----+----+----+----+

      +----+
char: | 41 |
      +----+
```
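You can watch this happen yourself. Here's a minimal sketch (it assumes a 4-byte `int`, and the reversed byte order will only show up on a little-endian machine); inspecting an object's bytes through an `unsigned char*` like this is well-defined C:

```c
#include <stdio.h>

int main(void) {
    int n = 'A';  /* 0x00000041 */
    unsigned char *bytes = (unsigned char *)&n;

    /* Print each byte of the int in the order it sits in memory. */
    for (size_t i = 0; i < sizeof n; i++)
        printf("byte %zu: 0x%02x\n", i, bytes[i]);

    return 0;
}
```

On x86 this prints `0x41` first and zeros after; on a big-endian machine the `0x41` would come last.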
When you take a pointer to the `int` and cast it to a `char*`, and then dereference it, the compiler will take the first byte of the `int` -- because `char`s are only one byte wide -- and print it out. The other three bytes get ignored! Now look back and notice that if the order of the bytes in the `int` were reversed, as on a big-endian architecture, you would have retrieved the value zero instead! So the behavior of this code -- the fact that the cast from `int*` to `char*` worked as you expected -- was strictly dependent on the machine you were running it on.
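In code, that first cast looks something like this (a sketch assuming the same 4-byte little-endian `int`):

```c
#include <stdio.h>

int main(void) {
    int n = 'A';           /* in memory: 41 00 00 00 */
    char c = *(char *)&n;  /* reads only the first byte of n */

    printf("%c\n", c);     /* prints 'A' -- but only on little-endian! */
    return 0;
}
```

On a big-endian machine, that first byte is `0x00`, and you'd print a NUL character instead of `'A'`.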
On the other hand, when you take a pointer to the `char` and cast it to an `int*`, and then dereference it, the compiler will grab the one byte in the `char` as you'd expect, but then it will also read three more bytes past it, because `int`s are four bytes wide! What is in those three bytes? You don't know! Your memory looks like this:
```
      +----+
char: | 41 |
      +----+

      +----+----+----+----+
int:  | 41 | ?? | ?? | ?? |
      +----+----+----+----+
```
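Here's what that looks like in code -- a sketch of exactly what *not* to do, for the reasons described below:

```c
#include <stdio.h>

int main(void) {
    char c = 'A';

    /* UNDEFINED BEHAVIOR: reads sizeof(int) bytes out of a one-byte
       object, and &c may not even be properly aligned for an int. */
    int n = *(int *)&c;

    printf("0x%08x\n", (unsigned)n);  /* low byte 0x41 on little-endian,
                                         the rest is garbage */
    return 0;
}
```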
You get a garbage value in your `int` because you're reading three bytes past the end of the `char` -- memory you never initialized and that doesn't belong to you. On a different platform or under a different planetary alignment, your code might work perfectly fine, or it might segfault and crash. There's no telling. This is what is known as undefined behavior, and it is a dangerous game that we play with our compilers. We have to be very careful when working with memory like this; there's nothing scarier than nondeterministic code.