
I have the character '¿'. If I cast it to int in C, the result is -61, but the same cast in C# gives 191. Can someone explain the reason?

C Code

```c
char c = '¿';
int I = (int)c;
/* Result: I = -62 */
```


C# Code

```csharp
char c = '¿';
int I = (int)c;
// Result: I = 191
```
Varun Gupta
    C# does not use ANSI but Unicode (UTF-16). – 500 - Internal Server Error Jan 03 '15 at 22:15
  • C# `char` is equivalent to C `wchar_t`, not `char`. `char` on your compiler couldn't contain the value 191 even if it wanted to (allowable range -128 through +127) – Ben Voigt Jan 03 '15 at 22:15
  • Note that "C code" is implementation dependent behavior - there is no requirement for `char` to be signed. See [Why is 'char' signed by default in C++?](http://stackoverflow.com/questions/17097537/why-is-char-signed-by-default-in-c) – Alexei Levenkov Jan 03 '15 at 22:39
  • unsigned char can contain 0 ... 255, and -61 is the signed version of unsigned 191. – user3629249 Jan 04 '15 at 02:47

1 Answer


This is how signed/unsigned numbers are represented and converted.

It looks like your C compiler's default is to use a signed byte as the underlying type for char (since you are not explicitly specifying unsigned char, the compiler's default is used; see Why is 'char' signed by default in C++?).

So 191 (0xBF) interpreted as a signed byte is a negative number (its most significant bit is 1): -65.

If you used unsigned char, the value would stay positive, as you expect.

If your compiler used a wider type for char (e.g. short), then 191 would stay positive 191 irrespective of whether char is signed or not.

In C#, char is always unsigned - see MSDN char:

Type: char
Range: U+0000 to U+FFFF

So 191 will always convert to int as you expect.

Alexei Levenkov