I'm trying to build a simple BIN -> HEX converter using a class; I'd like to move it into a header file later in case I ever need it.
It kind of works. "Kind of" because I do get output, but I can't understand what is happening when it prints X. Why am I falling into that final else branch? I should only ever get 4-bit combinations.
I'm still learning, so apologies if the code is naive.
#include <iostream>
#include <bitset>

class Hash{
private:
    char stringa[150];
    int byteCount = 0;
public:
    // call to get a string
    void getStringa(){
        char temp_char;
        std::cout << "Write a string and press enter to continue" << std::endl;
        for(unsigned int i = 0; i < 150; i++){
            temp_char = std::cin.get();
            if(temp_char == '\n'){
                stringa[i] = '\0';
                byteCount = i;
                break;
            }
            stringa[i] = temp_char;
        }
    }
    char nibbleToHEX(std::bitset<4> x){
        char HEX;
        if(x == 0000) return HEX = '0';
        else if (x == 0001) return HEX = '1';
        else if (x == 0010) return HEX = '2';
        else if (x == 0011) return HEX = '3';
        else if (x == 0100) return HEX = '4';
        else if (x == 0101) return HEX = '5';
        else if (x == 0110) return HEX = '6';
        else if (x == 0111) return HEX = '7';
        else if (x == 1000) return HEX = '8';
        else if (x == 1001) return HEX = '9';
        else if (x == 1010) return HEX = 'A';
        else if (x == 1011) return HEX = 'B';
        else if (x == 1100) return HEX = 'C';
        else if (x == 1101) return HEX = 'D';
        else if (x == 1110) return HEX = 'E';
        else if (x == 1111) return HEX = 'F';
        else return 'X';
    }
    // call to encode string to 256 binary digits and then go HEX a nibble at a time
    void encodeStringa(){
        std::cout << "converting |" << stringa << "| to binary: \n";
        char HEXSTRINGA[64];
        for(unsigned int i = 0; i < 150; i++){
            if(stringa[i] == '\0') break;
            std::bitset<4> x(stringa[i]);
            std::cout << x;
            HEXSTRINGA[i] = nibbleToHEX(x);
        }
        std::cout << std::endl;
        std::cout << "You used " << byteCount << " bytes.\n";
        std::cout << "You still have " << 64 - byteCount << " bytes." << std::endl;
        std::cout << "Converted string in HEX form: " << HEXSTRINGA << std::endl;
    }
};

int main() {
    Hash BCHAIN;
    BCHAIN.getStringa();
    BCHAIN.encodeStringa();
    return 0;
}
Some test I/O:
Input: **Teststring**
Binary: 0100010100110100001101000010100111100111
HEX: XXBXBXA3XF
X is an error, at least for what I am trying to do. I don't get why: I would expect some 4-bit combination for every char, since there are only 16 possible combinations of 4 bits, not an X. Is it an overflow issue?
(I mean x as a bitset of p bits, with p a power of 2.) I have noticed that the error occurs when the input contains a SPACE or another char I haven't handled. So if I use a bitset with p as 16 I should manage to get a Unicode conversion, or am I wrong?
– mustache Mar 01 '21 at 23:39