I am used to seeing flags (i.e., binary values) encoded as bits; see, for example, the SYN and ACK flags in the TCP header.
I recently stumbled upon the specification of Certificate Transparency: https://www.rfc-editor.org/rfc/rfc6962.html
Long story short: the main building block of the Certificate Transparency log is a Merkle tree, a tree of hashes. In order to prevent second preimage attacks, the spec requires distinguishing leaves from non-leaf nodes in the tree, which it does by prepending 0x00 to leaf inputs and 0x01 to non-leaf (internal) node inputs before hashing (see the Merkle Tree Hash definition in the RFC).
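To make it concrete, here is a minimal sketch of the hashing scheme I'm describing (Python; the function names and example data are mine, not from the RFC):

```python
import hashlib

LEAF_PREFIX = b"\x00"  # one-byte domain separator for leaf entries
NODE_PREFIX = b"\x01"  # one-byte domain separator for internal nodes

def hash_leaf(entry: bytes) -> bytes:
    """Hash a leaf entry: SHA-256(0x00 || entry)."""
    return hashlib.sha256(LEAF_PREFIX + entry).digest()

def hash_children(left: bytes, right: bytes) -> bytes:
    """Hash an internal node: SHA-256(0x01 || left_hash || right_hash)."""
    return hashlib.sha256(NODE_PREFIX + left + right).digest()

# Example: root of a two-leaf tree
root = hash_children(hash_leaf(b"entry A"), hash_leaf(b"entry B"))
print(root.hex())
```

The prefix is what stops an attacker from passing off an internal node's input as a leaf (or vice versa), which is the second-preimage concern mentioned above.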
I'm a bit puzzled because even though this information could be encoded in one bit, the RFC specifies encoding it as a whole byte (0x00 or 0x01). I am not sure what the rationale is.
To clarify, I understand why they separate leaves from non-leaf nodes and what second preimage attacks are. My question is: why would they encode one bit of information into one whole byte? I suspect it has to do with the properties of hash functions, but perhaps there is a simpler explanation.