
I define bits like this:

#define b00_on  0x0000000000000001
#define b01_on  0x0000000000000002
#define b02_on  0x0000000000000004
#define b03_on  0x0000000000000008
#define b04_on  0x0000000000000010

Each bit corresponds to an array offset like so:

array offset    bit
0               b00_on
1               b01_on
2               b02_on
3               b03_on
4               b04_on

I'm looking for a function that will translate bit values to array offsets, and I want to avoid computationally expensive operations such as multiplication, division, and roots.

Currently, I'm using a switch to do the translation like this:

- (int) convertBitToArrayOfst: (int64_t) bit
  {
  int n = -1;   // fallback for values that are not a single defined bit

  switch( bit )
     {
     case b00_on: n=0; break;
     case b01_on: n=1; break;
     case b02_on: n=2; break;
       .
       .
       .
     case b63_on: n=63; break;
     }
  return( n );
  }

But I'm not thrilled with it. Anyone have a suggestion for a more elegant, low-overhead way to make this translation?
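
For what it's worth, here is a minimal sketch of one low-overhead alternative, assuming a GCC- or Clang-style compiler: the __builtin_ctzll builtin counts trailing zero bits, and for a value with exactly one bit set that count is the bit's position, i.e. the array offset. The helper name bitToArrayOffset and the -1 sentinel for a zero argument are my own choices, not anything defined above.

#include <stdint.h>

// Sketch only: __builtin_ctzll is a GCC/Clang extension, not standard C.
// For a single-bit value, the number of trailing zeros equals the array offset.
static inline int bitToArrayOffset(uint64_t bit)
   {
   if( bit == 0 )
      return( -1 );                   // no bit set; -1 is an assumed sentinel
   return( __builtin_ctzll( bit ) );  // e.g. b04_on (0x10) -> 4
   }

On x86-64 and ARM64 this typically compiles to a single count-trailing-zeros instruction, so it involves no multiplication or division; on a compiler without the builtin, a shift-and-compare loop would use only shifts and comparisons.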

Gallymon
  • The hex literals are correct, @IanMacDonald; HEX 10 = DEC 16; HEX 16 = DEC 22. This way of writing the constants tripped me up for a second, too. Gallymon, the more usual way to write this would probably be `(1LL << 4)` -- and that would make it easier for you and others to read in the future. – jscs Jan 14 '15 at 22:27
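
To illustrate the shift style the comment describes, the defines at the top could be written as below. This is only a sketch of that notation; I've used unsigned 1ULL literals (my choice) rather than the comment's 1LL so the 63rd bit stays well-defined.

#define b00_on  (1ULL << 0)
#define b01_on  (1ULL << 1)
#define b02_on  (1ULL << 2)
#define b03_on  (1ULL << 3)
#define b04_on  (1ULL << 4)   // == 0x10, i.e. decimal 16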

0 Answers