
What will be the output of the following code?

#include<stdio.h>
int main()
{
    int a[4]={4096,2,3,4};
    char *p;
    p=a;
    printf("\n %d",(int)*(++p));
    return 0;
}

Edit: sizeof(int) == sizeof(void *) == 4 bytes

According to me the output should be 16 on a little endian machine and 0 on a big endian machine. Am I right?

asked by rodrigues (edited by Adam Wagner)
  • it should be a compiler warning....:P – Jon Nov 22 '10 at 12:58
  • Why would you think those are going to be the output? – Pablo Santa Cruz Nov 22 '10 at 12:58
  • On this machine, what is the size of an int? (see http://stackoverflow.com/questions/589575/c-size-of-int-long-etc) Did you mean to cast the input to printf to an int first? Using `(int)*(++p)` or `*((int *)++p)`? – sje397 Nov 22 '10 at 12:59
  • Sizeof int is 4 bytes and sizeof void* is also 4 bytes – rodrigues Nov 22 '10 at 13:01
  • @sje397: technically I think it doesn't matter, at least with `CHAR_BIT==8`. A char varargs argument is promoted either to `int` or to `unsigned int` (6.5.2.2/7), and it's valid to read it as either, since the values "16" and "0" are representable in either (7.15.1.1/2). Might as well put it in, though... – Steve Jessop Nov 22 '10 at 13:16
  • what's the array for? it's only necessary if you meant `*((int *)++p)` as sje397 suggested – Christoph Nov 22 '10 at 14:47

2 Answers


4096 (hex 0x1000) will be represented in memory (assuming 8-bit bytes, which is quite common nowadays) as either:

[0x00, 0x00, 0x10, 0x00] (big-endian)

or

[0x00, 0x10, 0x00, 0x00] (little-endian)

You're printing with %d, which expects an int, but you're passing a dereferenced character pointer (i.e., a char, which the variadic call promotes to int). Since ++p advances the pointer before the dereference, you read byte 1 of a[0], which is either 0x00 (big-endian) or 0x10 (little-endian), i.e. 0 or 16 in decimal. You will also need a cast to make the p = a statement compile cleanly, since int * and char * are incompatible pointer types.

So yes, I think you're right.

unwind
  • 391,730
  • 64
  • 469
  • 606

4096 is 0x1000, so (once you get it to compile) you'd expect 16 with a little-endian representation, and 0 with a big-endian representation unless int is 24 bits wide (that would give 16).

All this assumes CHAR_BIT == 8 and no funny business with padding bits. The edit stating sizeof(int) == 4 also rules out a 24-bit int under those assumptions.

Steve Jessop
  • 273,490
  • 39
  • 460
  • 699