#include <stdio.h>

int main(void) {
  int nr = 5;
  char castChar = (char)nr;
  char realChar = '5';
  printf("The value is: %d\n", castChar);
}

If the above code is compiled and run, the output will be:

The value is: 5

But if the code below is compiled and run, the console outputs the value 53 instead. Why doesn't it print the same value as when "castChar" is printed?

#include <stdio.h>

int main(void) {
  int nr = 5;
  char castChar = (char)nr;
  char realChar = '5';
  printf("The value is: %d\n", realChar);
}
  • https://en.wikipedia.org/wiki/ASCII – Ry- Feb 27 '19 at 22:42
  • because the ASCII value of the string '5' is 53. If you want to print it as a string then use %c – OldProgrammer Feb 27 '19 at 22:42
  • I know what ASCII is and how printf works. Take a look at the code again, the castChar is a type-casted char. And it prints 5, that is the translated value from ASCII. The realChar is initialized to a char, and not type-casted as castChar. Both are chars but they print out different values. – MoonOnAStick Feb 27 '19 at 22:45
  • @OldProgrammer You cannot assume ASCII character set – lost_in_the_source Feb 27 '19 at 22:45
  • `(char)5` and `'5'` mean completely different things. – user2357112 Feb 27 '19 at 22:50
  • @user2357112 could you please explain how they are different? From my understanding, a type conversion is basically a change of how the data is stored in the memory. – MoonOnAStick Feb 27 '19 at 22:51
  • @OldProgrammer `'5'` is not a string, it's a character. Strings come in `" "` double quotes in C. https://stackoverflow.com/a/3683613/7508077 – EsmaeelE Feb 27 '19 at 22:53
  • C is not an object-oriented language and casting != conversion. – Eli Korvigo Feb 27 '19 at 22:59
  • A link explaining the difference between casting and conversion: https://softwareengineering.stackexchange.com/a/133078/325759 – EsmaeelE Feb 27 '19 at 23:30

2 Answers


Because the value of castChar is the integer value 5, while the value of realChar is the integer encoding of the character value '5' (ASCII 53). These are not the same values.

Casting has nothing to do with it. Or, more accurately, casting nr to char doesn't give you the character value '5', it just assigns the integer value 5 to a narrower type.

If you expect the output 5, then you need to print realChar with the %c specifier, not the %d specifier.
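
A minimal sketch of the difference (the value 53 assumes an ASCII execution character set):

#include <stdio.h>

int main(void) {
  int nr = 5;
  char castChar = (char)nr;  /* still the integer value 5, just stored in a char */
  char realChar = '5';       /* the encoding of the character '5': 53 on ASCII systems */

  printf("%d %d\n", castChar, realChar);  /* prints "5 53" on an ASCII system */
  printf("%c\n", realChar);               /* prints the character: 5 */

  /* To get the character for a single digit, add '0' to the number. */
  printf("%c\n", (char)('0' + nr));       /* prints: 5 */
  return 0;
}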

John Bode
  • I hear that in C we can use a character set other than ASCII; is it possible that `realChar` in `char realChar = '5';` has a value different from 53? – EsmaeelE Feb 27 '19 at 22:56
  • Does that mean that the maximum value of casted ints differs from non-casted ints? – MoonOnAStick Feb 27 '19 at 22:58
  • @EsmaeelE: C leaves it up to the underlying platform - as long as the basic character set is supported and encodings for decimal digits are contiguous, it doesn't really care whether the underlying representation is ASCII or EBCDIC or whatever (see the sketch after these comments). – John Bode Feb 27 '19 at 22:59
  • @MoonOnAStick: Not quite sure what you mean by that. If you try to cast an integer value to a *signed* integer type that's too small to hold it (for example, `signed char x = (signed char) 12345;`), you'll get an implementation-defined result (or a trap). If you try to cast an integer value to an *unsigned* type that's too small, you'll get the modulus. – John Bode Feb 27 '19 at 23:09
  • Is this true? If the system encoding is EBCDIC, `char realChar = '5'` puts `0xF5: 11110101` into `realChar`, decimal equivalent 20. – EsmaeelE Feb 27 '19 at 23:19
  • @EsmaeelE: Decimal equivalent 245, but yes, it will be a different value from the ASCII code for `'5'`. – John Bode Feb 27 '19 at 23:23
  • _"i hear in C we can use other character set instead of ASCII"_ - You cannot arbitrarily choose the character set or encoding - that is platform specific not something you can select in the language. On modern platforms you generally have a choice between ASCII/ANSI and Unicode in the _execution environment_, but it is complicated because Unicode does not define a single encoding and there are several variable width encodings. – Clifford Feb 27 '19 at 23:41
  • Talk of EBCDIC is largely irrelevant - you cannot choose to use it if the platform does not use it, and platforms that do are obsolete - its encoding is designed to be convenient for use on punched cards, which makes it inconvenient for almost any other purpose. – Clifford Feb 27 '19 at 23:46
  • @Clifford _On modern platforms you generally have a choice between ASCII/ANSI_ - how can I choose between them? And why would we make that selection? What is the application of it? Is it possible to code on a machine with ASCII encoding, like my current PC, and develop an application that must run on another encoding like ANSI? – EsmaeelE Feb 28 '19 at 00:16
  • @EsmaeelE: You don’t make that choice. It’s whatever comes with the platform. – John Bode Feb 28 '19 at 03:13
  • @EsmaeelE: It is possible to write code that is designed to work correctly regardless of the source or execution character set. – jxh Feb 28 '19 at 10:55
  • @EsmaeelE: SO is not a discussion forum; you should not ask further questions in comments - especially comments against an answer where the question is unrelated. Unicode supports thousands of code-points and thus supports Kanji, Katakana, Hebrew, Cyrillic, European accented characters etc. - even Egyptian hieroglyphs! With 8 bit character sets, you are limited to languages with small alphabets and the use of locale specific code-pages. C itself does not have direct support for Unicode or code-pages - it is a platform issue. Research first: https://unicodebook.readthedocs.io/index.html – Clifford Feb 28 '19 at 12:33
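
A small sketch illustrating the two points raised in these comments; the narrowing result assumes an 8-bit unsigned char, and nothing else in it depends on ASCII:

#include <stdio.h>

int main(void) {
  /* '0'..'9' are guaranteed to be contiguous, so digit/character conversion
     works whether the execution character set is ASCII or EBCDIC. */
  int digit = 5;
  char asChar = (char)('0' + digit);  /* the character '5' in whatever encoding is in use */
  int back = asChar - '0';            /* the number 5 again */
  printf("%c %d\n", asChar, back);

  /* Narrowing to an unsigned type reduces the value modulo 2^N
     (here 12345 % 256 = 57); narrowing to a signed type that cannot
     hold the value is implementation-defined. */
  unsigned char u = (unsigned char)12345;
  printf("%u\n", (unsigned)u);  /* prints 57 with 8-bit char */
  return 0;
}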

(char)5 and '5' are not the same thing.

The literal '5' is an integer value that represents the character 5. Its value depends on the platform. Assuming ASCII representation for characters, this would be 53.

The expression (char)5 is the integer value 5 that has been cast to type char. It retains the value 5 after the cast.
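
A quick way to see this (the printed 53 again assumes ASCII):

#include <stdio.h>

int main(void) {
  printf("%d\n", (char)5);    /* 5  - the cast does not change the value        */
  printf("%d\n", '5');        /* 53 - the character's encoding on this platform */
  printf("%d\n", '5' - '0');  /* 5  - recover the numeric value of the digit    */
  return 0;
}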

jxh
  • What's the point of type-casting if it doesn't behave as the type casted? – MoonOnAStick Feb 27 '19 at 23:03
  • Casts should generally be avoided. There is almost always a way to rewrite code that uses a type cast into cleaner code that does not. Type-casting in C would be used to reinterpret a type as a different type, such as a signed integer as an unsigned integer (see the sketch after these comments). – jxh Feb 27 '19 at 23:05
  • Strict aliasing requires an object to only be interpreted by its own type or as `char`. So, you will sometimes see an object pointer be treated as a `char *`, and sometimes this will be done via a cast. – jxh Feb 27 '19 at 23:09
  • @MoonOnAStick: it *does* behave like the type casted - if necessary, the value being cast will be converted to a different representation. It will not change the *value*, though. Again, the integer value `5` and the character value `'5'` are not the same, and a simple cast won't convert one to the other. – John Bode Feb 27 '19 at 23:19
  • Another case where a cast is required is to make sure a pointer is passed into a variable argument function that expects to see a NULL pointer. Since `NULL` may be defined as a bare `0`, it would be passed as an `int` without a cast. – jxh Feb 27 '19 at 23:31
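
A minimal sketch of two of the legitimate uses of casts mentioned in these comments - reinterpreting a signed value as unsigned, and examining an object's bytes through an unsigned char pointer:

#include <stddef.h>
#include <stdio.h>

int main(void) {
  /* Cast used to reinterpret a signed value as an unsigned one:
     -1 converts to UINT_MAX (reduction modulo UINT_MAX + 1). */
  int n = -1;
  printf("%u\n", (unsigned int)n);

  /* Cast to unsigned char * to examine an object's bytes; accessing an
     object through a character type is permitted by the aliasing rules. */
  double d = 1.0;
  const unsigned char *p = (const unsigned char *)&d;
  for (size_t i = 0; i < sizeof d; ++i)
    printf("%02x ", (unsigned)p[i]);
  printf("\n");
  return 0;
}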