
I have this array of bits:

int bits[8] = { 0, 1, 1, 0, 0, 1, 0, 1 };

This is 0x65 in hex, or 101 in decimal; the ASCII character is 'e'. How do I go about reading my array into a char and an int (the decimal value)?

fUrious

1 Answer


You could use bit shifting to build the char from the bit array, like so:

#include <iostream>

int main() {
    int bits[8] = { 0, 1, 1, 0, 0, 1, 0, 1 };
    char result = 0; // store the result

    for (int i = 0; i < 8; i++) {
        result += bits[i] << (7 - i); // shift each bit into its place value and add it
    }

    std::cout << result << '\n'; // prints e
}

This loops through your array, shifts each bit into its place value, and adds it to the accumulating `result` variable. The output should be "e".
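
Since the question also asks for the decimal value as an int, here is a minimal sketch (not part of the original answer) that accumulates into an int instead, assuming the same most-significant-bit-first ordering:

#include <iostream>

int main() {
    int bits[8] = { 0, 1, 1, 0, 0, 1, 0, 1 };
    int value = 0; // accumulate the numeric value

    for (int i = 0; i < 8; i++) {
        value += bits[i] << (7 - i); // bits[0] is treated as the most significant bit
    }

    std::cout << value << '\n';                    // prints 101
    std::cout << static_cast<char>(value) << '\n'; // prints e
}

Casting the accumulated value back to char gives the same letter, so one loop covers both outputs.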

Keveloper
  • I'd probably throw in an `assert(bits[i] == 0 || bits[i] == 1)`, because without it a stray value would silently produce a wrong result. – UKMonkey Jun 25 '18 at 14:39
  • And I'd use `std::bitset` because C != C++ (except if C is a floating point value large enough); see the sketch after these comments. – YSC Jun 25 '18 at 14:43
  • `for(int i = 0; i < 8; i++) { result = (result << 1) | bits[i]; }` would work too and IMHO is slightly easier to understand. (Neither my version nor Keveloper's answer checks that the values in bits[] are either 0 or 1.) Better to use `bits[i] == 0 ? 0 : 1` instead of just `bits[i]`. – Dorin Botan Jun 25 '18 at 14:46
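
For completeness, here is a rough sketch of the `std::bitset` route YSC mentions (my own illustration of that suggestion, not code from the thread), again treating bits[0] as the most significant bit:

#include <bitset>
#include <iostream>

int main() {
    int bits[8] = { 0, 1, 1, 0, 0, 1, 0, 1 };

    std::bitset<8> b;
    for (int i = 0; i < 8; i++) {
        b[7 - i] = (bits[i] != 0); // std::bitset indexes from the least significant bit
    }

    unsigned long value = b.to_ulong();
    std::cout << value << ' '                      // 101
              << static_cast<char>(value) << '\n'; // e
}

`to_ulong()` also yields the integer value the question asks for, and the `!= 0` comparison addresses the validation concern raised in the comments, though an explicit assert would catch bad input more loudly.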