In C, there's the sizeof operator to determine the byte size of a given data type or object. Likewise, there's CHAR_BIT from <limits.h>, which is defined to reflect the number of bits in a byte.
Now this might be slightly hypothetical, but how do I tell how many different values the smallest unit of information can store, i.e. whether the host environment provides bits, trits, nats, or something else entirely?
Answer
Apparently, the C standard assumes that the host environment operates on bits: it defines a bit as a unit of data storage large enough to hold an object that may have one of two values. So the smallest unit of information a conforming implementation exposes always stores exactly two values, and there is no standard way to query for trits or anything more exotic.
Notable proposals that arose from this question
Name of the smallest unit of information of a ternary machine: a TIT
Name of the smallest unit of information of a quaternary machine: a QUIT