Here's my procedure:
static void undo_bitstring(std::string& str) {
    // Walk the string one charbits-sized group of '0'/'1' characters at a time.
    for (unsigned i = 0; i < str.length(); i += charbits) {
        int ascii_val = 0;
        for (unsigned j = 0; j < charbits; j++) {
            // exp2 comes from <cmath>; bit j of the group has weight 2^(charbits-j-1).
            if (str[i+j] == '1') ascii_val += (int)exp2(charbits-j-1);
        }
        // Write the decoded character back into the front of the same string.
        str[i/charbits] = (char)ascii_val;
    }
    // Discard everything after the decoded characters.
    str.erase(str.begin()+str.size()/charbits, str.end());
}
where, just so you know, charbits was defined by

static const size_t charbits = 8 * sizeof(char);
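In case it matters: sizeof(char) is 1 by definition, so charbits always comes out to 8. A compile-time check along these lines (purely illustrative, not part of my actual file; CHAR_BIT comes from <climits>) would catch a platform where a byte isn't 8 bits:

#include <climits>  // CHAR_BIT: number of bits in a byte on this platform

// Illustrative sanity check, placed next to the existing definition of charbits.
static_assert(charbits == CHAR_BIT, "this code assumes 8-bit chars");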
What is supposed to happen is that, for example,

std::string str = "01010111";
undo_bitstring(str);

should change str to "W", since
0x2^7 + 1x2^6 + 0x2^5 + 1x2^4 + 0x2^3 + 1x2^2 + 1x2^1 + 1x2^0
= 64 + 16 + 4 + 2 + 1
= 87
and (int)'W' = 87.
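For concreteness, a stripped-down driver along these lines is what I have in mind for that case (assuming undo_bitstring and charbits above are in the same file, and that <cmath> is included for exp2):

#include <iostream>
#include <string>

int main() {
    std::string str = "01010111";
    undo_bitstring(str);  // expected result: str == "W"
    std::cout << str << " (length " << str.size() << ")\n";
    return 0;
}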
And of course this procedure is supposed to work for any string of 0's and 1's whose length is a multiple of charbits. For instance,

std::string str = "010101110101011101010111";
undo_bitstring(str);

should change str to "WWW".
On the tests I've run, the output just looks like a bunch of boxes with question marks inside them, so the characters being produced clearly aren't the ones I expect.
Any ideas?
Am I going about this totally wrong in the first place? This is part of an encryption/decryption algorithm I'm trying to write.