HTTP (or any protocol/system) being 8-bit clean does not mean that you can simply use any 8-bit value wherever you want within the protocol. It means only that the protocol or system is capable of handling 8-bit encoding given the right circumstances.
For example, HTTP uses carriage return + line feed (hex `0D 0A`) to delimit header fields and the body of the message, so you can't use those two bytes together anywhere in the headers. Further, the headers and body may have limitations on their character encoding based on what type of data they contain. If the HTTP `Content-Type` is set to `text/html; charset=utf-8`, characters in the body like `<` (hex `3C`) are reserved for HTML tags. The HTTP body may be 8-bit clean, but that doesn't mean you can put any 8-bit content you want in it; you still have to conform to UTF-8 (or some other declared encoding) and abide by the content rules that HTML imposes.
The purpose of Base64 is to encode arbitrary binary data for use inside other encoding schemes where characters other than `[A-Za-z0-9+/]` are reserved for special uses, or are totally invalid (such as inside HTML, or in a URL query string). You cannot just replace Base64 with a full 8-bit encoding scheme, because an 8-bit scheme is not valid in the situations where Base64 is necessary. This is true even if the protocol you're using is, itself, 8-bit clean.
In short, which binary encoding scheme you should use depends on much more than just 8-bit clean vs. not 8-bit clean. It depends on the protocol you're embedding the encoding inside, what that protocol's control characters are, and in which situations those characters are reserved.
**Update:**
If all you're really looking to do is return raw binary in an HTTP response, just set the HTTP `Content-Type` to `application/octet-stream`. This will allow you to return arbitrary binary in the HTTP body without any need for encoding.
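As a sketch using Python's standard-library `http.server` (the handler name and payload are hypothetical), a response with that content type can carry every 8-bit value unencoded:

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Every 8-bit value 0x00-0xFF -- no Base64 needed.
PAYLOAD = bytes(range(256))

class BinaryHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "application/octet-stream")
        self.send_header("Content-Length", str(len(PAYLOAD)))
        self.end_headers()
        self.wfile.write(PAYLOAD)  # raw bytes straight into the body

    def log_message(self, *args):  # keep the demo quiet
        pass

# Serve on an ephemeral port and fetch the body back.
server = HTTPServer(("127.0.0.1", 0), BinaryHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = "http://127.0.0.1:%d/" % server.server_address[1]
body = urllib.request.urlopen(url).read()
server.shutdown()
assert body == PAYLOAD  # arrived byte-for-byte intact
```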