I used to think that binary-to-text encoding exists because every device has its own way of interpreting bytes. So if one router sends a byte as some significant piece of information, another router might treat that byte as a parity byte or something else... But isn't all of that already covered by character encoding? I mean, a character encoding tells you which byte represents which character, right? (Or am I missing something?) Isn't knowing the character encoding (like UTF-8) enough for devices to read the bytes directly? If so, why would anyone want to encode the data again (using something like Base64), since that only increases the amount of data that has to be transferred?
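For illustration, here is a minimal Python sketch (the byte values are arbitrary examples chosen for this question) showing why a character encoding alone isn't enough: arbitrary binary data often isn't valid UTF-8 at all, whereas Base64 maps every 3 raw bytes to 4 safe ASCII characters, which is exactly the ~33% size overhead mentioned above.

```python
import base64

# Arbitrary raw bytes, e.g. part of an image or ciphertext -- not valid UTF-8 text.
raw = bytes([0x00, 0xFF, 0xFE, 0x80, 0x10, 0x42])

# UTF-8 can't "read" these bytes as characters:
try:
    raw.decode("utf-8")
except UnicodeDecodeError as e:
    print("Not decodable as UTF-8:", e)

# Base64 turns any byte sequence into plain ASCII that survives
# text-only channels, at the cost of roughly 4/3 the size.
encoded = base64.b64encode(raw)
print(encoded)                  # b'AP/+gBBC'
print(len(raw), len(encoded))   # 6 -> 8
```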
I just found the answer over here: https://stackoverflow.com/a/201510/169513 (I'm not sure whether you are still looking for an answer, but I hope this helps!) – Mugen Sep 02 '22 at 14:07