#include <iostream>
using namespace std;

int main(){
    int x = 0x414243;
    cout.write( (char*)&x, 1);
    cout.write( ((char*)&x) + 1, 2);
}

The output is:

CBA

I don't understand what (char*)& is doing with x.

Looking at this ASCII table http://www.asciitable.com/, it seems to me write() is writing 101, 102, 103 in octal (the codes for 'A', 'B', 'C')... in reverse!

How is char* managing to do this?

calm-tedesco
  • Duplicate of https://stackoverflow.com/questions/22030657/little-endian-vs-big-endian ? – François Andrieux Jun 12 '17 at 15:05
  • `(char*)&x` is first taking the memory location of x (`&x`) and then casting it as a char pointer `(char*)(&x)` – Easton Bornemeier Jun 12 '17 at 15:07
  • @FrançoisAndrieux I have no idea what 'endianness' is and I don't see how it is related to this – calm-tedesco Jun 12 '17 at 15:08
    @Alberto endianness determines the byte order used to encode some types of values. As other answers have pointed it, you are likely on a little endian system, where the least significant bytes come first. Perhaps reading the answers from the posted link will help clarify things. – François Andrieux Jun 12 '17 at 15:16

2 Answers


ASCII codes for upper case 'C', 'B', and 'A' are 67, 66, and 65, i.e. 0x43, 0x42, and 0x41.

It looks like your computer is 32-bit little-endian, so the octets of 0x00414243 (two extra zeros are for clarity, to complete 32-bit int) are placed in memory as follows:

0x43, 0x42, 0x41, 0x00

This represents a null-terminated string "CBA".

Note that on a big-endian hardware the octets would be placed in reverse order, i.e.

0x00, 0x41, 0x42, 0x43

so interpreting this number as a null-terminated string would produce empty output.

Sergey Kalinichenko

The integer type on your platform is almost certainly wider than one byte (char).

Since it occupies more than one char, its bytes can be stored with the least significant byte first (a.k.a. little endian) or the most significant byte first (big endian).

The expression (char *)&x is a pointer to the first byte of the integer.

Thus, it could be 0x00 or 0x43, depending on the byte ordering for your platform (assuming that x is 4 bytes in length).

Thomas Matthews