
For class, I had to write a program that shows whether a machine is big- or little-endian. I made this:

#include <stdio.h>

typedef unsigned char *byte_pointer;

int show_bytes(int x) {
    byte_pointer a = (byte_pointer)&x;
    printf("%c", a[0]); // why doesn't printing a[0] here give the ASCII representation?
    if (a[0] == 0x01)
        return 1;
    return 0;
}

int isEndian() {
    int i = 1; 
    return show_bytes(i);
}

int main(int argc, char **argv){
    int a = isEndian();
    printf("%d", a);
}

And I do not understand why, when printing back a[0] on a little-endian machine, it does not print the Unicode representation of 1?

chqrlie
Mr N
  • The int value you're passing to show_bytes is 1, so you'll get ASCII character 1 'printed'. You probably wanted to set `int i = '1';`. – einpoklum Feb 08 '22 at 11:39
  • Unicode or ASCII character #1 is a control code, without necessarily any visible representation (depending on your terminal). – interjay Feb 08 '22 at 11:40
  • Assuming ASCII, the value `1` is the `SOH` (Start of Header) character. Not the character `'1'` (which is ASCII `49`). – Some programmer dude Feb 08 '22 at 11:41
  • Ok, thank you everyone for the answers. – Mr N Feb 08 '22 at 11:45
  • BTW, there is a good illustration for how endian detection works [here](https://stackoverflow.com/a/12792301/645128) – ryyker Feb 08 '22 at 14:02
  • See [this page](https://commandcenter.blogspot.com/2012/04/byte-order-fallacy.html) for a nice explanation of why you often *don't* need to try to figure out whether your machine is big- or little-endian. – Steve Summit Feb 08 '22 at 14:35
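
To see the comments' point in running form, here is a minimal sketch (assuming an ASCII, little-endian machine; the variable names are illustrative) showing that the byte value 1 is the invisible SOH control code, while the character '1' is the value 49:

#include <stdio.h>

int main(void) {
    int i = 1;
    unsigned char *p = (unsigned char *)&i;

    printf("byte value: %d\n", p[0]);   // 1 on a little-endian machine: the SOH control code
    printf("as a char : [%c]\n", p[0]); // usually nothing visible between the brackets
    printf("'1' has value %d\n", '1');  // 49 on ASCII systems
    return 0;
}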

1 Answer

The bytes making up an integer in memory have nothing to do with the bytes representing the ASCII (or Unicode) characters that would serve as a human-readable decimal representation of that integer.

If I say

int i = 1234;

the bytes representing it in memory are either

d2 04

or

d2 04 00 00

(using little-endian order, and depending on whether type int has 16 or 32 bits).

But the bytes representing a decimal string representation (such as I'd get with sprintf(buf, "%d", i)) are

31  32  33  34
'1' '2' '3' '4'
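
A minimal sketch makes the contrast visible (assuming a little-endian machine with a 32-bit int): it dumps the bytes of the integer object and then the bytes of its decimal string:

#include <stdio.h>
#include <string.h>

int main(void) {
    int i = 1234;                               // 0x000004d2
    unsigned char *p = (unsigned char *)&i;
    char buf[12];
    size_t n;

    // Bytes of the integer object itself.
    printf("int bytes   : ");
    for (n = 0; n < sizeof i; n++)
        printf("%02x ", p[n]);                  // d2 04 00 00 here
    printf("\n");

    // Bytes of its decimal string representation.
    sprintf(buf, "%d", i);                      // the string "1234"
    printf("string bytes: ");
    for (n = 0; n < strlen(buf); n++)
        printf("%02x ", (unsigned char)buf[n]); // 31 32 33 34
    printf("\n");
    return 0;
}
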
Steve Summit
  • The locations with zeros are also part of the memory representation: `00 00 d2 04` ? (and would help to show clearly that the test will see zeros for one, and non-zeros for the other) (+1) – ryyker Feb 08 '22 at 14:09
  • May I ask how, for example, char c = '3'; is represented in memory? I hope it is not a stupid question, but I can't find the answer on the internet. I understand that chars are ASCII representations, but then what is the byte value of c in this case? – Mr N Feb 09 '22 at 20:09
  • @MrN The byte value is exactly the ASCII value: `0x33`, or 51 decimal. You can easily see this yourself: just do `printf("%d\n", c);` or `printf("%x\n", c);`. – Steve Summit Feb 09 '22 at 20:16
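
A quick self-contained sketch of that last point (each printf shows the same underlying byte):

#include <stdio.h>

int main(void) {
    char c = '3';
    printf("%c\n", c); // prints the character: 3
    printf("%d\n", c); // prints its byte value: 51
    printf("%x\n", c); // the same value in hex: 33
    return 0;
}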