When you serialize an object in .NET using the BinaryFormatter, you end up with a byte array that is meaningless to a human reader.
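For concreteness, here is roughly the setup, with a hypothetical `Person` type standing in for my real object:

```csharp
using System;
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;

[Serializable]
class Person
{
    public string Name;
    public int Age;
}

class Program
{
    static void Main()
    {
        var person = new Person { Name = "Alice", Age = 30 };

        // Serialize the object graph into an in-memory byte array.
        byte[] bytes;
        using (var stream = new MemoryStream())
        {
            new BinaryFormatter().Serialize(stream, person);
            bytes = stream.ToArray();
        }

        // The payload is opaque binary data, not readable text.
        Console.WriteLine(BitConverter.ToString(bytes));
    }
}
```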
Does this byte array correspond to a more meaningful string representation that is human-readable, or do you need to fully deserialize the object to get one?
I would expect the binary formatter to have some intermediate string representation of the object that it uses before emitting the byte array. That would be perfect for my needs...
I tried Base64-encoding the byte array, but the result was still gibberish.
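Concretely, continuing from the sketch above (the sample output shown in the comment is just indicative):

```csharp
// bytes: the serialized payload from the sketch above.
// Base64 produces printable characters, but the content stays opaque:
string base64 = Convert.ToBase64String(bytes);
Console.WriteLine(base64); // e.g. "AAEAAAD/////AQAAAAAAAAA..."
```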
EDIT:
As explained in my answer, decoding the bytes as UTF-8 is the best you can get.
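In other words, something like the following, assuming `bytes` holds the payload from the earlier sketch. The type and member names stored in the stream become visible, though the surrounding control bytes come out as junk characters:

```csharp
using System.Text;

// bytes: the serialized payload from the sketch above.
// The assembly, type, and field names are stored as plain text
// inside the stream, so decoding as UTF-8 exposes those fragments:
string readable = Encoding.UTF8.GetString(bytes);
Console.WriteLine(readable);
```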
The reason I want to do this is so that I can diff two binary serializations and store only the first serialization plus the diff. I was interested in seeing how the serialization works in order to figure out how best to diff the byte arrays.
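For the diff itself, the naive starting point I had in mind is a byte-for-byte comparison, sketched below with a hypothetical `ByteDiff` helper. It assumes the two payloads have equal lengths, which is exactly the limitation that made me want to understand the format first:

```csharp
using System;
using System.Collections.Generic;

static class ByteDiff
{
    // Record each offset where the two payloads differ. This sketch
    // assumes equal lengths; a real scheme would also have to handle
    // insertions and deletions (e.g. a string field growing), which
    // shift every subsequent byte in the stream.
    public static List<(int Offset, byte NewValue)> Diff(byte[] a, byte[] b)
    {
        if (a.Length != b.Length)
            throw new ArgumentException("Sketch assumes equal lengths.");

        var changes = new List<(int, byte)>();
        for (int i = 0; i < a.Length; i++)
        {
            if (a[i] != b[i])
                changes.Add((i, b[i]));
        }
        return changes;
    }
}
```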