
This is the C code:

#include <stdio.h>

int main(int argc, const char *argv[])
{
    printf("%d", '0/');
    return 0;
}

The output is 12335! Then I replaced '0/' with '00' and '000', and the outputs changed to 12336 and 3158064, where 12336 = 48*(1+2^8) and 3158064 = 48*(1+2^8+2^16). However, I still don't know why. What happens when '0/' is converted to an integer for output?

PS: My computer is a MacBook Pro, and the operating system is OS X 10.9.5 (13F34). The compiler is Apple LLVM 6.0.
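For reference, here is a minimal test that reproduces all three values (the commented results assume the same compiler and character set as above):

#include <stdio.h>

int main(void)
{
    /* Multi-character constants; the values below are what this
       compiler produces and are implementation-defined. */
    printf("%d\n", '0/');   /* 12335 */
    printf("%d\n", '00');   /* 12336 */
    printf("%d\n", '000');  /* 3158064 */
    return 0;
}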

Peng Liu
  • Even though this is a basic question, I would have given a +1 for a nearly well-formed first-time question if the OP had added one more thing: the expected output. Consider that for future questions. – chux - Reinstate Monica Sep 27 '14 at 23:34

3 Answers


You have constructed a "multi-character literal". The behaviour is implementation-defined, but in your case, the integer value is constructed from the ASCII values (12335 == 48 * 256 + 47).
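A quick check makes this construction explicit (a sketch assuming the same ASCII-based packing as the asker's compiler):

#include <stdio.h>

int main(void)
{
    /* '0' is 48 and '/' is 47 in ASCII; this compiler packs them
       high-byte-first into a single int. */
    printf("%d\n", ('0' << 8) | '/');  /* 12335, same as '0/' here */
    return 0;
}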

Oliver Charlesworth

'0/' is a multi-character constant, which means it has an implementation-defined value. In your case, the ASCII values of the characters are 0x30 and 0x2f. These are combined into 0x302f, which equals 12335.
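Printing the constant in hexadecimal makes the packing visible (a sketch assuming the same implementation-defined behavior):

#include <stdio.h>

int main(void)
{
    /* The two ASCII bytes 0x30 and 0x2f appear side by side. */
    printf("%#x\n", '0/');  /* prints 0x302f, i.e. 12335, on this setup */
    return 0;
}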

Drew McGowen

Because '0/' is a multi-character constant of type int. Its value is implementation-defined; with your compiler, the first byte '0' is multiplied by 256 and then the second byte '/' is added to it. This produces the value that you see:

printf("%d %d %d", '0', '/', '0'*256+'/');

prints

48 47 12335


Note that this behavior is system-dependent. On other systems you could see 12080 instead of 12335.
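For instance, an implementation that packs the bytes in the opposite order would effectively compute the following (a hypothetical sketch, not your compiler's behavior):

#include <stdio.h>

int main(void)
{
    /* Hypothetical opposite packing: '/' in the high byte, '0' low. */
    printf("%d\n", '/' * 256 + '0');  /* 12080 */
    return 0;
}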

See this answer for more information on multi-character constants.

Sergey Kalinichenko