-4

In the C programming language, if an int is 4 bytes and letters are represented in ASCII as a number (also an int), then why is a char 1 byte?

  • 1
    `sizeof(char)` is 1 because that's what the standard defines it to be no matter how many bits it is. – Shawn Aug 07 '21 at 02:10
  • 1
    You question has already been answered very well here: https://stackoverflow.com/a/5030541/10183216 – CJay Horton Aug 07 '21 at 03:28

3 Answers

1

A char is one byte because the standard says so. But that's not really what you are asking. On a typical implementation where char is signed and 8 bits wide, it can hold decimal values from -128 to 127. Have a look at a table of ASCII character codes and you'll notice that the decimal values of those codes are between 0 and 127, so they all fit in the non-negative values of a char. Extended character sets use unsigned char, which holds values from 0 to 255.
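For illustration, here is a minimal sketch (assuming an 8-bit char and an ASCII execution character set, as on virtually every modern platform) showing that ASCII codes fit in the non-negative range of a plain char, while extended codes need unsigned char:

```c
#include <stdio.h>
#include <limits.h>

int main(void)
{
    char c = 'A';           /* ASCII 65 fits in the non-negative range of char */
    unsigned char u = 200;  /* extended character codes need unsigned char     */

    printf("sizeof(char) = %zu\n", sizeof(char));                 /* always 1          */
    printf("'A' stored in a char: %d\n", c);                      /* prints 65         */
    printf("char range: %d..%d\n", CHAR_MIN, CHAR_MAX);           /* e.g. -128..127    */
    printf("unsigned char range: 0..%u\n", (unsigned)UCHAR_MAX);  /* e.g. 0..255       */
    printf("extended code in unsigned char: %u\n", (unsigned)u);  /* prints 200        */
    return 0;
}
```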

LEF
  • 188
  • 1
  • 9
1
6.2.5 Types
...
3 An object declared as type char is large enough to store any member of the basic execution character set. If a member of the basic execution character set is stored in a char object, its value is guaranteed to be nonnegative. If any other character is stored in a char object, the resulting value is implementation-defined but shall be within the range of values that can be represented in that type.
...
5 An object declared as type signed char occupies the same amount of storage as a ‘‘plain’’ char object. A ‘‘plain’’ int object has the natural size suggested by the architecture of the execution environment (large enough to contain any value in the range INT_MIN to INT_MAX as defined in the header <limits.h>).
C 2012 Online Draft

Type sizes are not defined in terms of bits, but in terms of the range of values that must be represented.

The basic execution character set consists of 96 or so characters (26 uppercase Latin characters, 26 lowercase Latin characters, 10 decimal digits, 29 graphical characters, space, vertical tab, horizontal tab, line feed, form feed); 8 bits is more than sufficient to represent those.

int, OTOH, must be able to represent a much wider range of values; the minimum range as specified in the standard is [-32767..32767]¹, although on most modern implementations it’s much wider.


  1. The standard doesn’t assume two’s complement representation of signed integers, which is why the required minimum for INT_MIN is -32767 rather than -32768.
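A minimal sketch that prints these ranges via <limits.h>; the exact output depends on the implementation (the values shown in the comments are what a typical 32- or 64-bit desktop platform reports, the standard only guarantees the minimum ranges):

```c
#include <stdio.h>
#include <limits.h>

int main(void)
{
    /* The standard fixes sizeof(char) at 1 and requires CHAR_BIT >= 8;
       everything else is expressed as minimum ranges, not bit counts. */
    printf("CHAR_BIT    = %d\n", CHAR_BIT);                /* typically 8                  */
    printf("char        : %d .. %d\n", CHAR_MIN, CHAR_MAX);/* e.g. -128 .. 127             */
    printf("int         : %d .. %d\n", INT_MIN, INT_MAX);  /* at least -32767 .. 32767     */
    printf("sizeof(int) = %zu\n", sizeof(int));            /* typically 4, not guaranteed  */
    return 0;
}
```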
John Bode
  • 119,563
  • 19
  • 122
  • 198
0

In the C language, a char usually has a size of 8 bits. In all the compilers I have seen (which are, admittedly, not very many), char is taken to be large enough to hold the ASCII character set (or the so-called “extended ASCII”), and the size of the char data type is 8 bits; this includes compilers on the major desktop platforms and some embedded systems. One byte is sufficient to represent the whole character set.
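Note also that a character constant such as 'A' has type int in C, which is part of why the question arises in the first place; a short sketch (assuming ASCII and a 4-byte int) showing that the int value of a letter still fits in a single-byte char:

```c
#include <stdio.h>

int main(void)
{
    /* In C (unlike C++), character constants are ints. */
    printf("sizeof('A')  = %zu\n", sizeof('A'));   /* same as sizeof(int), e.g. 4 */
    printf("sizeof(char) = %zu\n", sizeof(char));  /* always 1                    */

    char c = 'A';                    /* the int value 65 is stored into one byte */
    printf("c = %c (%d)\n", c, c);   /* prints: c = A (65)                       */
    return 0;
}
```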

Rasathurai Karan
  • 673
  • 5
  • 16