I was confused by the following (simplified) piece of code. During user registration it hashes the password, converts the hash to a string, and saves it in the database. Later, when the user tries to log in, the code reads the stored value from the database, gets its bytes, and compares them with the hash of the password the user entered.
static void Main(string[] args)
{
    // User registration
    byte[] passwordBytes = Encoding.Unicode.GetBytes("P@ssword");
    byte[] hashBytes = GetHash(passwordBytes);
    string stringFieldInDb = Encoding.Unicode.GetString(hashBytes); // password hash is stored in the database

    // Check password
    byte[] hashBytesInDb = Encoding.Unicode.GetBytes(stringFieldInDb); // was read from the database
    byte[] enteredPasswordBytes = Encoding.Unicode.GetBytes("P@ssword");
    byte[] enteredPasswordHash = GetHash(enteredPasswordBytes);

    // is false
    var isPasswordValid = hashBytesInDb.SequenceEqual(enteredPasswordHash);
    // this way is true
    var isPasswordValid2 = stringFieldInDb == Encoding.Unicode.GetString(enteredPasswordHash);
}

private static byte[] GetHash(byte[] data)
{
    return new SHA512CryptoServiceProvider().ComputeHash(data);
}
The hashes are slightly different. Bytes of the hash string read from the database:
161, 127, 0, 49, 27, 146, **253, 255**, 109, 214, **253, 255**, 113, 75, 226, ...
Bytes of the hash string generated from the entered password at login:
161, 127, 0, 49, 27, 146, **74, 219**, 109, 214, **65, 220**, 113, 75, 226, ...
I shortened the example above to three lines; what is the reason for this result?
byte[] someCharBytes = new byte[] { 74, 219 };
string someChar = Encoding.Unicode.GetString(someCharBytes);
byte[] differentSomeCharBytes = Encoding.Unicode.GetBytes(someChar); //returns { 253, 255 }
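If it helps, the bytes `{ 253, 255 }` are little-endian for U+FFFD, the Unicode replacement character, and `{ 74, 219 }` decodes to the code unit 0xDB4A, a lone high surrogate that is not valid UTF-16 on its own. A small check of that (my own experiment, not the original code):

```csharp
using System;
using System.Text;

class SurrogateDemo
{
    static void Main()
    {
        // { 74, 219 } is 0xDB4A in little-endian UTF-16: a high surrogate
        // with no following low surrogate, so the sequence is invalid
        byte[] someCharBytes = { 74, 219 };
        string someChar = Encoding.Unicode.GetString(someCharBytes);

        // The decoder replaced the invalid unit with U+FFFD (65533)
        Console.WriteLine((int)someChar[0]); // 65533

        // Re-encoding U+FFFD yields { 253, 255 }
        byte[] back = Encoding.Unicode.GetBytes(someChar);
        Console.WriteLine($"{back[0]}, {back[1]}"); // 253, 255
    }
}
```

So it looks like `Encoding.Unicode.GetString` silently substitutes the invalid surrogates in the hash, which is why the round trip through a string changes the bytes.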