This is the C code:
#include <stdio.h>

int main(int argc, const char *argv[])
{
    printf("%d", '0/');  /* '0/' is a multi-character character constant */
    return 0;
}
The output is 12335! Then I tried replacing '0/' with '00' and '000', and the outputs changed to 12336 and 3158064, where 12336 = 48*(1+2^8) and 3158064 = 48*(1+2^8+2^16). However, I still don't know why. What happens when '0/' is converted to an integer for output?
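From the numbers alone, it looks as if each character's ASCII value ('0' is 48, '/' is 47) ends up in successive bytes of the resulting int, with earlier characters in the higher-order bytes. The following is only a sketch of that guess (the shift-and-OR packing is my assumption, not something I know the standard guarantees), reproducing the three observed values:

#include <stdio.h>

int main(void)
{
    /* Assumption: the compiler packs the characters into an int,
       earlier characters in higher-order bytes. */
    int guess_0slash = ('0' << 8) | '/';                 /* 48*256 + 47 = 12335 */
    int guess_00     = ('0' << 8) | '0';                 /* 48*256 + 48 = 12336 */
    int guess_000    = ('0' << 16) | ('0' << 8) | '0';   /* 48*(2^16 + 2^8 + 1) = 3158064 */

    printf("%d %d %d\n", guess_0slash, guess_00, guess_000);
    return 0;
}

This prints 12335 12336 3158064 on my machine, matching the outputs above, but I don't know whether that packing is what the compiler actually does or whether it is defined behavior.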
PS: My computer is a MacBook Pro running OS X 10.9.5 (13F34), and the compiler is Apple LLVM 6.0.