I'm currently working on a program in Eclipse that converts to and from base64. However, I've just noticed that char values appear to have 7 bits instead of the usual 8. For example, the character 'o' shows up in binary as 1101111 rather than 01101111, which effectively blocks my project, since I need a full 24 bits for the conversion to work. Is there any way to either append a 0 to the front of the value (I tried bit-shifting in both directions, but neither worked) or prevent the issue altogether?
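
For reference, here is a minimal standalone check (it uses std::bitset purely for illustration and is not part of my encoder) showing the full 8-bit pattern I expected to see for 'o':

#include <bitset>
#include <iostream>

int main()
{
    unsigned char c = 'o';                  // 0x6F

    // A char is stored as a full byte; only the printed form drops the
    // leading zero. std::bitset<8> forces all eight digits to be shown.
    std::cout << std::bitset<8>(c) << '\n'; // prints 01101111

    return 0;
}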
The code for the (incomplete/nonfunctional) offending method is below; let me know if more is required:
std::string Encoder::encode(char* src, unsigned char* dest)
{
    // Grab the three input bytes for this block.
    char ch0 = src[0];
    char ch1 = src[1];
    char ch2 = src[2];

    // Attempt at extracting the first 6-bit group (not used anywhere yet).
    char sixBit1 = ch0 >> 1;

    // Placeholder output: currently just copies the bytes back in reverse.
    dest[0] = ch2;
    dest[1] = ch1;
    dest[2] = ch0;
    dest[3] = '-';

    return std::string(); // placeholder so the function returns something
}
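
For context, the packing I am ultimately aiming for is roughly the following (an untested sketch of the idea rather than working code; the base64Chars table is just a stand-in for whatever alphabet lookup the finished encoder would use):

#include <string>

// Stand-in 64-character alphabet for the sketch below.
static const std::string base64Chars =
    "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/";

// Sketch only: pack three 8-bit chars into one 24-bit block, then slice
// that block into four 6-bit indices into the alphabet.
void encodeBlock(const char* src, unsigned char* dest)
{
    unsigned int block = (static_cast<unsigned char>(src[0]) << 16)
                       | (static_cast<unsigned char>(src[1]) << 8)
                       |  static_cast<unsigned char>(src[2]);

    dest[0] = base64Chars[(block >> 18) & 0x3F];
    dest[1] = base64Chars[(block >> 12) & 0x3F];
    dest[2] = base64Chars[(block >>  6) & 0x3F];
    dest[3] = base64Chars[ block        & 0x3F];
}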