I'm making my own implementation of a bit array in C, based on Java's BitSet, which uses the 64-bit long type. In C, though, an integer type of exactly 64 bits isn't guaranteed to exist, so I've defined an integer_t that I use everywhere in my project:
#include <stdint.h> /* for intmax_t */

typedef intmax_t integer_t;
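To give a rough picture, a minimal sketch of the container built on these words (the struct and field names are illustrative, and the word count field is my own simplification, not exact code):

#include <stddef.h>

typedef struct {
    integer_t *buffer;     /* packed words holding the bits */
    size_t     word_count; /* number of integer_t words allocated */
} bit_array_t;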
The bit array itself is a buffer of integer_t, and the following function maps a bit index to an index into that buffer:
static integer_t bit_buffer_index(integer_t bit_index)
{
    /* word index = bit_index / bits-per-word, computed as a shift */
    return bit_index >> bit_shifts;
}
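For context, here is a simplified sketch of how the buffer index pairs with the bit offset inside a word; the helper names bit_word_offset and bit_array_get are illustrative, and the cast to uintmax_t is there because right-shifting a signed word with its sign bit set would be implementation-defined:

static integer_t bit_word_offset(integer_t bit_index)
{
    /* the low bits select the bit inside one word; the mask is
       bits-per-word - 1, derived from bit_shifts */
    return bit_index & (((integer_t)1 << bit_shifts) - 1);
}

static int bit_array_get(const integer_t *buffer, integer_t bit_index)
{
    /* shift in an unsigned type so a set sign bit in the stored word
       cannot cause implementation-defined behaviour */
    uintmax_t word = (uintmax_t)buffer[bit_buffer_index(bit_index)];
    return (int)((word >> bit_word_offset(bit_index)) & 1u);
}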
It's static because it's an implementation detail. bit_shifts is defined in the same source file as:
/* intended to be log2 of the number of bits in integer_t:
   6 for 64-bit words, 5 for 32-bit, 4 for 16-bit, 3 for one byte */
static const integer_t bit_shifts =
    ((sizeof(integer_t) * 8) >> 6) > 0 ? 6
    : ((sizeof(integer_t) * 8) >> 5) > 0 ? 5
    : ((sizeof(integer_t) * 8) >> 4) > 0 ? 4
    : 3; // One byte
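For comparison, here is a sketch of an alternative that avoids hard-coding the shift chain: keep the bits-per-word count (via CHAR_BIT) and use plain division, on the assumption that the compiler reduces division by a power-of-two constant to shift-based code. BITS_PER_WORD and the function name are illustrative, not from my actual code:

#include <limits.h> /* CHAR_BIT */

/* bits per word; assumes intmax_t has no padding bits, which holds
   on all common implementations */
#define BITS_PER_WORD ((integer_t)(sizeof(integer_t) * CHAR_BIT))

static integer_t bit_buffer_index_div(integer_t bit_index)
{
    /* same mapping as the shift version for non-negative indices */
    return bit_index / BITS_PER_WORD;
}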
But is this the correct approach to defining bit_shifts? Is there a macro for this? How would you implement it? It feels like I'm missing something.