I have to analyse the output of these code fragments:

#include <stdio.h>
int main(void)
{
    int x, y;
    x = 200; y = 100;
    x = x+y; y = x-y; x = x-y;
    printf("%d %d\n", x, y);
}

#include <stdio.h>
int main(void)
{
    char x, y;
    x = 200; y = 100;
    x = x+y; y = x-y; x = x-y;
    printf("%d %d\n", x, y);
}

So I know that int stands for integer and char for character; I've read about the differences, and I know that if I use %d in the printf it prints the value as a decimal number, while %c prints it as a character.

The ASCII code for 'A' is 65, for example. But why does the second fragment print 100 -56 instead of 100 200?
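For example, I tried this little test (assuming an ASCII system, where 'A' is 65):

#include <stdio.h>

int main(void)
{
    int n = 65;
    printf("%d\n", n);   /* prints the value as a number: 65 */
    printf("%c\n", n);   /* prints it as a character: A (on an ASCII system) */
    return 0;
}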

asked by Souza (edited by Jonathan Leffler)

2 Answers

On the platform used in the question, the type char is evidently 1 byte (8 bits) in size and is a signed type with 1 sign bit and 7 value bits, using 2's complement arithmetic. It stores values from -128 to 127. So, this is what's happening to x and y:

x = 200 => 200 doesn't fit in a signed char; it wraps to 200 - 256, so x takes the value -56
y = 100 => y takes the value 100
x = x+y => -56 + 100 = 44, so x takes the value 44
y = x-y => 44 - 100 = -56, so y takes the value -56
x = x-y => 44 - (-56) = 100, so x takes the value 100

So the printf prints 100 -56.
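If you want to watch it happen, here's a small test program; this is only a sketch that assumes your platform's char is 8 bits, signed, and wraps 2's-complement style on conversion (none of which the C standard guarantees):

#include <stdio.h>

int main(void)
{
    char x, y;
    x = 200; y = 100;        /* 200 doesn't fit in a signed 8-bit char: x becomes -56 */
    printf("start: x=%d y=%d\n", x, y);
    x = x + y;               /* -56 + 100 = 44 */
    y = x - y;               /*  44 - 100 = -56 */
    x = x - y;               /*  44 - (-56) = 100 */
    printf("end:   x=%d y=%d\n", x, y);
    return 0;
}

On such a platform this prints start: x=-56 y=100 and then end: x=100 y=-56.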
answered by ssantos (edited by Jonathan Leffler)
  • Why not -127 to 128? btw, thank you for your clarification, makes sense now. – Souza Apr 08 '13 at 02:03
  • @Souza, Use the leftmost bit as the sign bit and keep adding 1 on paper. That's not guaranteed by C (only a range of [-127,127] is guaranteed for `signed char`), but that's what is really common today. The term to look for is two's complement. – chris Apr 08 '13 at 02:04
  • 5
    chars aren't necessarily 8 bits, no are they necessarily signed. It would be better if your answer were prefixed with "Apparently, in the C environment you are using, ..." – Jim Balter Apr 08 '13 at 02:06

C has a variety of integer types: char (at least 8 bits), short (at least 16 bits), int (at least 16 bits), long (at least 32 bits). There are unsigned varieties of those. If a signed computation overflows, the behavior is undefined; you should never do that, and the compiler may assume you never do and not check at all. Converting an out-of-range value to a signed type, as x = 200 does to a signed 8-bit char here, gives an implementation-defined result. In the unsigned case, values simply "wrap around". But note that the exact sizes are not guaranteed, only the minimums; there have been machines on which all of these types were 32 bits wide.
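You can ask your own implementation what it actually provides via the macros in <limits.h>; a quick sketch (the output varies from platform to platform):

#include <stdio.h>
#include <limits.h>

int main(void)
{
    printf("bits in a char: %d\n", CHAR_BIT);
    printf("char : %d ... %d\n", CHAR_MIN, CHAR_MAX);
    printf("short: %d ... %d\n", SHRT_MIN, SHRT_MAX);
    printf("int  : %d ... %d\n", INT_MIN, INT_MAX);
    printf("long : %ld ... %ld\n", LONG_MIN, LONG_MAX);
    return 0;
}

On a typical current platform this reports an 8-bit signed char with range -128 to 127, matching the other answer.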

answered by vonbrand (edited by Jonathan Leffler)
  • @chris That has nothing to do with this answer. A `char` can be 32 bits wide and `sizeof(char)` is *still* 1. – Jim Balter Apr 08 '13 at 02:08
  • @JimBalter, Yeah, I was just thinking about that. Even though I know it's not true, my mind rushed to one byte being eight bits. – chris Apr 08 '13 at 02:09