8

How do I get the number of bits in type char?

I know about CHAR_BIT from climits. It is described as »The macro yields the maximum value for the number of bits used to represent an object of type char.« in Dinkumware's C Reference. I take that to mean the number of bits in a char — is that right?

Can I get the same result with std::numeric_limits somehow? std::numeric_limits&lt;char&gt;::digits returns 7 here, which is correct as far as it goes, but unfortunately not what I want, because that value accounts for the signedness of the 8-bit char on this platform…

mkluwe

5 Answers

15

CHAR_BIT is, by definition, the number of bits in the object representation of type [signed/unsigned] char.

numeric_limits<>::digits is the number of non-sign bits in the value representation of the given type.

Which one do you need?

If you are looking for the number of bits in the object representation, then the correct approach is to take the sizeof of the type and multiply it by CHAR_BIT (of course, in the specific case of char types there's no point in multiplying by sizeof, since their size is always 1, and CHAR_BIT by definition already contains what you need).

If you are talking about the value representation, then numeric_limits&lt;&gt; is the way to go.

For the unsigned char type, the bit-size of the object representation (CHAR_BIT) is guaranteed to be the same as the bit-size of the value representation, so you can use numeric_limits&lt;unsigned char&gt;::digits and CHAR_BIT interchangeably, though this might be questionable from a conceptual point of view.

AnT stands with Russia
  • Well, I forgot to mention why I was not content with using CHAR_BIT: I would like to get the number of bits in some templated code, and numeric_limits&lt;&gt; seemed to fit in better at first glance. But the approach of using sizeof(char) * CHAR_BIT as mentioned by R Samuel Klatchko does the job. – mkluwe Feb 12 '10 at 10:50
10

If you want to be overly specific, you can do this:

sizeof(char) * CHAR_BIT

If you know you are definitely dealing with char, this is a bit of overkill, as sizeof(char) is guaranteed to be 1.

But if you move to a different type such as wchar_t, the sizeof factor becomes important.

R Samuel Klatchko
2

Looking at the snippets archive for this code, here's an adapted version (I do not claim this code as my own):

int countbits(char ch){
    int n = 0;
    if (ch){
        do n++;
        while (0 != (ch = ch&(ch-1)));
    }
    return n;
}

Hope this helps, Best regards, Tom.

t0mm13b
1

An inefficient way:

char c;
int bits;
for ( c = 1, bits = 0; c; c <<= 1, bits++ )
   ;
printf( "bits = %d\n", bits );
Mark Wilkins
0

I have no reputation here yet, so I'm not allowed to comment on @t0mm13b's answer, but I wanted to point out that there's a problem with the code:

int countbits(char ch){
    int n = 0;
    if (ch){
        do n++;
        while (0 != (ch = ch&(ch-1)));
    }
    return n;
}

The above won't count the number of bits in a character; it will count the number of set bits (1 bits).

For example, the following call will return 4:

char c = 'U';
countbits(c);

The code:

ch = ch & (ch - 1)

is a trick to strip off the rightmost (least significant) bit that is set to 1. It therefore skips over any bits set to 0 and doesn't count them.

stconnell