14

A group of 8 bits is called a "byte". What is a group of 16 bits called? "Short"? "Word"?

And what about 32 bits? I know "int" is CPU-dependent, I'm interested in universally applicable names.

Ecir Hana
  • 10,864
  • 13
  • 67
  • 117
  • *I'm interested in universally applicable names.* Then the universally applicable names you should use are "16-bits" and "32-bits". If your universally applicable names context precludes **byte** specifically for 8-bits, you can use "8-bits" or "octet". – Eljay Jul 22 '21 at 13:01

8 Answers

22

A byte is the smallest unit of data that a computer can work with. The C language defines char to be one "byte" and has CHAR_BIT bits. On most systems this is 8 bits.

A word, on the other hand, is usually the size of the values the CPU handles natively; most of the time, this is the size of the general-purpose registers. The problem with this definition is that it doesn't age well.

For example, the MS Windows WORD datatype was defined back in the early days, when 16-bit CPUs were the norm. When 32-bit CPUs came around, the definition stayed, and a 32-bit integer became a DWORD. And now we have 64-bit QWORDs.

Far from "universal", but here are several different takes on the matter:

Windows:

  • BYTE - 8 bits, unsigned
  • WORD - 16 bits, unsigned
  • DWORD - 32 bits, unsigned
  • QWORD - 64 bits, unsigned

GDB:

  • Byte
  • Halfword (two bytes).
  • Word (four bytes).
  • Giant words (eight bytes).

<stdint.h>:

  • uint8_t - 8 bits, unsigned
  • uint16_t - 16 bits, unsigned
  • uint32_t - 32 bits, unsigned
  • uint64_t - 64 bits, unsigned
  • uintptr_t - pointer-sized integer, unsigned

(Signed types exist as well.)

If you're trying to write portable code that relies upon the size of a particular data type (e.g. you're implementing a network protocol), always use <stdint.h>.

Jonathon Reinhart
  • 132,704
  • 33
  • 254
  • 328
  • Spot on. These terms are extremely important when dealing with Computer Architecture & Assembly Language (low-level programming). – InamTaj Oct 24 '14 at 18:42
4

The correct name for a group of exactly 8 bits is really an octet. A byte may have more or fewer than 8 bits (although this is relatively rare).

Beyond this there are no rigorously well-defined terms for 16 bits, 32 bits, etc, as far as I know.

Paul R
  • 208,748
  • 37
  • 389
  • 560
4

There's no universal name for 16-bit or 32-bit units of measurement.

The term 'word' is used to describe the number of bits processed at a time by a program or operating system. So, in a 16-bit CPU, the word length is 16 bits. In a 32-bit CPU, the word length is 32 bits. I also believe the term is a little flexible, so if I write a program that does all its processing in chunks of say, 10 bits, I could refer to those 10-bit chunks as 'words'.

And just to be clear: 'int' is not a unit of measurement for computer memory. It is just the data type used to store integer numbers (i.e. whole numbers, with no fractional component). So if you find a way to implement integers using only 2 bits (or whatever) in your programming language, that would still be an int.

chm
  • 1,519
  • 13
  • 21
3

Dr. Werner Buchholz coined the word byte to mean, "a unit of digital information to describe an ordered group of bits, as the smallest amount of data that a computer could process." Therefore, the word's actual meaning is dependent on the machine in question's architecture. The number of bits in a byte is therefore arbitrary, and could be 8, 16, or even 32.

For a thorough dissertation on the subject, refer to Wikipedia.

Ian Atkin
  • 6,302
  • 2
  • 17
  • 24
  • 2
    As the Wikipedia article says, the de facto modern usage is that a "byte" is always 8 bits (even for machines that use 32- or 64-bit words). Other definitions of byte did float around in the early days of computing, but those are effectively archaic usages now. – Luke Feb 01 '17 at 23:43
  • @Luke I wouldn't say "archaic"; there are DSP chips where `CHAR_BIT` is 16 or 32: https://stackoverflow.com/questions/32091992/is-char-bit-ever-8 – Jonathon Reinhart Jan 28 '19 at 18:19
1

short, word and int are all dependent on the compiler and/or architecture.

  • int is a datatype and is usually 32-bit on desktop 32-bit or 64-bit systems. I don't think it's ever larger than the register size of the underlying hardware, so it should always be a fast (and usually large enough) datatype for common uses.
  • short may be of smaller size than int, that's all you know. In practice, they're usually 16-bit, but you cannot depend on it.
  • word is not a datatype, it rather denotes the natural register size of the underlying hardware.

And regarding the names of 16 or 32 bits, there aren't any. There is no reason to label them.

gustaf r
  • 1,224
  • 9
  • 14
0

I used to hear them referred to as byte, word and long word. But as others mention, it is dependent on the native architecture you are working on.

kmkaplan
  • 18,655
  • 4
  • 51
  • 65
0

They are simply called 2 bytes and 4 bytes.

Lieuwe
  • 1,734
  • 2
  • 27
  • 41
0

There aren't any universal terms for 16 and 32 bits. The size of a word is machine dependent.

Chris McCabe
  • 1,010
  • 11
  • 20