Very simple CS question. I was reading the MD5 documentation, RFC 1321, where it says:
The algorithm takes as input a message of arbitrary length and
produces as output a 128-bit "fingerprint" or "message digest" of the input.
So it's saying MD5 generates a 128-bit = 16-byte hash for any given input.
Then, when I use the md5 tool on Unix/macOS or an online MD5 generator, it produces a hash that is 32 characters long, which would mean 32 bytes (my understanding being that 1 char = 1 byte). For example:
$ md5 <<<"1"
b026324c6904b2a9cb4b88d6d61c81d1
$ printf "b026324c6904b2a9cb4b88d6d61c81d1" | wc -c
32
But when I try the Java MD5 API, it gives me a 16-byte hash, which is consistent with the documentation:
scala> import java.security.MessageDigest
import java.security.MessageDigest
scala> MessageDigest.getInstance("MD5").digest("1".getBytes)
res0: Array[Byte] = Array(-60, -54, 66, 56, -96, -71, 35, -126, 13, -52, 80, -102, 111, 117, -124, -101)
scala> val hash = MessageDigest.getInstance("MD5").digest("1".getBytes("UTF-8")).length
hash: Int = 16
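Out of curiosity I also hex-encoded the 16 digest bytes myself (just my own check, not something from RFC 1321 or the md5 man page), and that does come out at 32 characters:
scala> val digest = MessageDigest.getInstance("MD5").digest("1".getBytes("UTF-8"))
digest: Array[Byte] = Array(-60, -54, 66, 56, -96, -71, 35, -126, 13, -52, 80, -102, 111, 117, -124, -101)
scala> val hex = digest.map(b => f"$b%02x").mkString  // two hex chars per byte
hex: String = c4ca4238a0b923820dcc509a6f75849b
scala> hex.length
res1: Int = 32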
The question is: what am I missing about md5 (the BSD Unix tool)?
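I also noticed the shell digest above differs from the Scala one; I assume that's because <<<"1" feeds "1\n" (with a trailing newline) to md5. Hashing "1\n" and hex-encoding it the same way does reproduce the shell output exactly:
scala> val shellStyle = MessageDigest.getInstance("MD5").digest("1\n".getBytes("UTF-8")).map(b => f"$b%02x").mkString
shellStyle: String = b026324c6904b2a9cb4b88d6d61c81d1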