I have seen many examples of how to implement Base64 encoders, but none of them use a struct inside a union to accomplish the translation from three 8-bit blocks into four 6-bit blocks. I have wondered why no one uses this method, because to me it looks like an easy and fast approach.
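For reference, the shift-and-mask split that those encoders typically use looks roughly like this (a minimal sketch of my own, not taken from any particular implementation; the function name is mine):

#include <cstdint>

// Pack three 8-bit bytes into a 24-bit block and pull out four 6-bit groups.
// b0 is the most significant input byte, s0 the most significant 6-bit group.
inline void split_24_bits ( uint8_t b0, uint8_t b1, uint8_t b2,
                            uint8_t & s0, uint8_t & s1,
                            uint8_t & s2, uint8_t & s3 )
{
    const uint32_t block = ( uint32_t ( b0 ) << 16 )
                         | ( uint32_t ( b1 ) << 8 )
                         |   uint32_t ( b2 );
    s0 = ( block >> 18 ) & 0x3F;
    s1 = ( block >> 12 ) & 0x3F;
    s2 = ( block >> 6  ) & 0x3F;
    s3 =   block         & 0x3F;
}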
I wrote an example of the union-struct:
#include <cstdint>

namespace Base64
{
    typedef union
    {
        struct                       // three 8-bit input bytes plus one pad byte
        {
            uint32_t b2  : 0x08;
            uint32_t b1  : 0x08;
            uint32_t b0  : 0x08;
            uint32_t pad : 0x08;
        } decoded;
        struct                       // the same 24 bits viewed as four 6-bit groups
        {
            uint32_t b3  : 0x06;
            uint32_t b2  : 0x06;
            uint32_t b1  : 0x06;
            uint32_t b0  : 0x06;
            uint32_t pad : 0x08;
        } encoded;
        uint32_t raw;
    } base64c_t;
}
I have tested translating 0xFC0FC0, or in binary 111111000000111111000000, into four 6-bit blocks with this method, and it seems to work:
#include <iostream>
#include <iomanip>

Base64::base64c_t b64;

// write three 8-bit bytes ...
b64.decoded.b0 = 0xFC;
b64.decoded.b1 = 0x0F;
b64.decoded.b2 = 0xC0;

// ... and read them back as four 6-bit groups
std::cout.fill ( '0' );
std::cout << "0x" << std::hex << std::setw ( 2 ) << b64.encoded.b0 << std::endl;
std::cout << "0x" << std::hex << std::setw ( 2 ) << b64.encoded.b1 << std::endl;
std::cout << "0x" << std::hex << std::setw ( 2 ) << b64.encoded.b2 << std::endl;
std::cout << "0x" << std::hex << std::setw ( 2 ) << b64.encoded.b3 << std::endl;
Output:
0x3f
0x00
0x3f
0x00
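As a cross-check, pulling the same four 6-bit groups out of 0xFC0FC0 with plain shifts gives the same sequence (my own snippet, not part of the test above):

uint32_t block = 0xFC0FC0;
std::cout << std::hex
          << ( ( block >> 18 ) & 0x3F ) << " "        // 3f
          << ( ( block >> 12 ) & 0x3F ) << " "        // 0
          << ( ( block >> 6  ) & 0x3F ) << " "        // 3f
          << (   block         & 0x3F ) << std::endl; // 0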
Is there a downside to this way of translating 8-bit blocks into 6-bit blocks? Or has no one thought of this approach before?