
Why has something as fundamental as the number of bits in a byte been kept implementation-defined by the C standard? Are there examples where this could be useful?

From C99, 3.6:

3.6 byte

addressable unit of data storage large enough to hold any member of the basic character set of the execution environment

NOTE 1 It is possible to express the address of each individual byte of an object uniquely.

NOTE 2 A byte is composed of a contiguous sequence of bits, the number of which is implementation defined. The least significant bit is called the low-order bit; the most significant bit is called the high-order bit.

EDIT: I was asking something more fundamental: why does the C standard give flexibility in the number of bits in a byte? I am not asking about sizeof(char); more specifically, what is the benefit of allowing CHAR_BIT != 8? If the question still seems like a duplicate, please down-vote it and I will close it.
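For concreteness, here is a minimal sketch (assuming a hosted C99 implementation) that queries the implementation-defined byte width through CHAR_BIT from <limits.h>:

```c
#include <limits.h>
#include <stdio.h>

int main(void)
{
    /* CHAR_BIT is the implementation-defined number of bits in a byte;
       it must be at least 8, but nothing stops it from being larger. */
    printf("bits per byte (CHAR_BIT): %d\n", CHAR_BIT);
    printf("sizeof(char):             %zu (always 1, by definition)\n", sizeof(char));
    printf("bits in an int:           %zu\n", sizeof(int) * CHAR_BIT);
    return 0;
}
```

On a typical desktop implementation this would print 8, 1 and 32; on an implementation with wider bytes the first and last numbers change while sizeof(char) stays 1.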

physicist
  • Not all architectures have 8 bit bytes. – Paul R Feb 22 '18 at 19:42
  • Related (duplicate?): [Are there machines, where sizeof(char) != 1, or at least CHAR_BIT > 8?](https://stackoverflow.com/questions/2215445/are-there-machines-where-sizeofchar-1-or-at-least-char-bit-8) or [Is CHAR_BIT ever > 8?](https://stackoverflow.com/questions/32091992/is-char-bit-ever-8) – Martin R Feb 22 '18 at 19:42
  • 4
    @MartinR sizeof(char) is always 1, by definition. CHAR_BIT may vary. – Bjorn A. Feb 22 '18 at 19:44
  • you may find this answere interesting: https://stackoverflow.com/a/2098227/8513665 – Christian Gibbons Feb 22 '18 at 19:44
  • 2
    @BjornA.: Yes, I know, thanks. I just linked to a Q&A where examples for CHAR_BIT > 8 are given. – Martin R Feb 22 '18 at 19:44
  • 2
    "Why is something as fundamental as the number of bits in a byte, been kept implementation-defined by C standard?" --> to allow C to compile on the widest gamut of platforms and compliers including those that do not have an 8-bit byte. C is very inclusive and was a key factor in its early adoption and its rapid adoption to new exotic systems. – chux - Reinstate Monica Feb 22 '18 at 19:55
  • 1
    [Exotic architectures the standards committees care about](https://stackoverflow.com/questions/6971886/exotic-architectures-the-standards-committees-care-about) – Bo Persson Feb 22 '18 at 19:55
  • @chux that makes sense – physicist Feb 22 '18 at 19:58
  • 1
    AFAIK the language C was developed to make it easier to write operating system code than in assembler. So it has to cater for any OS, including legacy systems. – Weather Vane Feb 22 '18 at 19:59
  • 2
    8 bits is an *octet* - a *byte* is the smallest addressable unit of storage. A byte *may* be the same as an octet, but it doesn't *have* to be. Many of the systems in use at the time C was designed were using byte sizes that weren't 8 bits (Harbison & Steele describe a system that used 36-bit words and could store 5 7-bit ASCII characters per word). In C, a `char` must be *at least* 8 bits wide, but may be wider. – John Bode Feb 22 '18 at 20:57
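Picking up on the comments about `CHAR_BIT` and octets: code that genuinely depends on 8-bit bytes can state that assumption explicitly instead of silently misbehaving on a platform with wider bytes. A minimal sketch:

```c
#include <limits.h>

/* Make the "bytes are octets" assumption explicit: translation stops
   with a diagnostic on any implementation where CHAR_BIT != 8. */
#if CHAR_BIT != 8
#error "This code assumes 8-bit bytes (CHAR_BIT == 8)"
#endif
```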

1 Answer


Many older machines and current-day DSPs have larger bytes (as in: they can address memory only in multiples of, say, 16 bits). Surely you don't want to leave out an important segment of the embedded world.
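As a hedged illustration (the helper name `to_octets` is made up here), code that must produce octets, say for a wire protocol, can be written without assuming that a byte *is* an octet; on a DSP with `CHAR_BIT == 16`, each octet simply occupies the low 8 bits of one char-sized storage unit:

```c
/* Split the low 32 bits of a value into four octets, most significant
   first. unsigned long is guaranteed to hold at least 32 bits, and the
   shifts and masks work whether CHAR_BIT is 8, 16 or anything larger. */
void to_octets(unsigned long v, unsigned char out[4])
{
    for (int i = 0; i < 4; i++)
        out[i] = (unsigned char)((v >> (8 * (3 - i))) & 0xFFul);
}
```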

Matteo Italia
  • For the vast majority of C code, the ability to run on such platforms wouldn't add any value whatsoever. Having a language specification describe the behavior of "weird" platforms is often useful; a belief that non-defective programs should be written to accommodate weird implementations whenever possible is not. – supercat Aug 29 '18 at 22:17
  • I agree completely, the "you" was rhetorical (if you like, it's a "you who are writing the C standard", mostly as an answer to the very first question in the OP). – Matteo Italia Aug 29 '18 at 22:21
  • If I were writing the Standard, I would make it much more obvious that there are many things which non-garbage-quality compilers will do absent a solid and documented reason for doing otherwise; an expectation that an implementation will behave in such fashion when no good reason exists for it to do otherwise should be viewed as an expectation that an implementation will be of non-garbage quality. Only in rare cases (such as when circumstances necessitate the use of a specialized or garbage-quality implementation) should such an expectation be viewed as a defect. – supercat Aug 29 '18 at 22:32