What is implied, but not really stated, in the other answers is that int and char can be treated as fairly equivalent in C#. You can do math on chars just like you can on ints, and you can convert back and forth between int and char with ease. The number you get for a char is its Unicode code point (a C# char is a UTF-16 code unit, so for digits and letters it matches the familiar ASCII value).
'0' as a char has an int value of 48, so you could do:
int x = 7;
char c = (char)(x + 48); // c is '7' as a char, or 55 as an int (the cast is needed because x isn't a constant)
Another example:
char c = 'a';
c++;
Console.Write(c); // prints 'b', because 'a' + 1 is 'b'
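The arithmetic works the other way round as well: subtracting '0' from a digit character gives you the digit back as an int, and subtracting one letter from another gives you their distance. A quick sketch (variable names are just for illustration):
int digit = '7' - '0'; // 55 - 48 = 7
int gap = 'e' - 'a';   // 4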
It's quite logical and occasionally handy*, but the main reason you see '0' written in code is that '0' is easier to remember than 48 (the hex version, 0x30, is only slightly easier).
All of these give you the char '5' from the int 5:
char five = 5 + 48;
char five = 5 + 0x30;
char five = 5 + '0';
Which one would you find easiest to remember? :)
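To see the '0' trick doing some work, here's a purely illustrative way of turning a small non-negative int into its decimal string by hand, peeling off one digit at a time and adding '0' to each (int.ToString() does this for you, of course):
int n = 407;
string s = "";
do
{
    s = (char)(n % 10 + '0') + s; // take the last digit and prepend it as a char
    n /= 10;
} while (n > 0);
// s is now "407"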
*for example, say you wanted to count the characters in an ASCII string, you could do:
var counts = new int[256];
foreach (char c in someString)
    counts[c]++;
You can use the char to index the array just like you can an int. At the end of the operation, "hello world" would have put a 1 in index 104 (the 'h'), a 3 in index 108 (the 'l'), and so on.
Sure, these days you might use a Dictionary<char, int> instead, but appreciating that intrinsic char/int equivalence, and how it can be used, still has its merits.
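For completeness, a minimal sketch of the Dictionary version (using the same "hello world" example):
var counts = new Dictionary<char, int>();
foreach (char c in "hello world")
{
    counts.TryGetValue(c, out int n); // n is 0 if the char isn't in the dictionary yet
    counts[c] = n + 1;
}
// counts['l'] == 3, counts['h'] == 1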