I'm having a problem using the output of my SHA256 implementation in an HMAC calculation. The first/'inner' SHA256 matches the expected output, but the second/'outer' one does not.
I am following the first example here:
I know:
- My `K0` is correct
- My `K0 xor ipad` is correct
- My `K0 xor opad` is correct
- Even my `Hash((Key^ipad)||text)` is correct
The problem is that the ASCII representation of the final/outer `sha256()`, i.e. `Hash((K0 xor opad) || Hash((K0 xor ipad) || text))`, is incorrect.
I get 218135127ad7a8e5967ce7b47499214d1df46a9589eb0bec7637c86021fd928d
but it should be 8BB9A1DB9806F20DF7F77B82138C7914D174D59E13DC4D0169C9057B133E1D62
Code:
// Up to this point ipad, opad, K0, both XORs all match the expected values
const std::array<uint32_t, 8>& h_output_1 = Hash<BUFFER_SIZE>(&xor_1[0], 64, &message[0], message_length);
// This is correct
LOG("H_1: " << sha256::GetASCIIRepresentation(h_output_1));
const std::array<uint32_t, 8>& h_output_2 = Hash<BUFFER_SIZE>(&xor_2[0], 64, &h_output_1[0], 32);
// This is NOT correct
LOG("H_2: " << sha256::GetASCIIRepresentation(h_output_2));
`Hash()` appends the two buffers and calculates the SHA256 of the result. Given that the first SHA256 works, I suspect the problem is in how the 'inner' SHA256 output (8x `uint32_t`) is appended to `K0 xor opad` for the final 'outer' SHA256 calculation.
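For clarity, this is how I understand the eight state words should map to the 32 digest bytes before they are appended (a minimal sketch, assuming big-endian byte order per FIPS 180-4; `WordsToBytes` is a hypothetical helper, not my actual code):

#include <array>
#include <cstddef>
#include <cstdint>

// Minimal sketch (hypothetical helper, not my actual code): serialize the
// eight 32-bit SHA256 state words into the 32-byte digest, most significant
// byte first, which is the byte order FIPS 180-4 defines for the digest.
std::array<uint8_t, 32> WordsToBytes(const std::array<uint32_t, 8>& words) {
    std::array<uint8_t, 32> bytes{};
    for (std::size_t i = 0; i < words.size(); ++i) {
        bytes[4 * i + 0] = static_cast<uint8_t>(words[i] >> 24);
        bytes[4 * i + 1] = static_cast<uint8_t>(words[i] >> 16);
        bytes[4 * i + 2] = static_cast<uint8_t>(words[i] >> 8);
        bytes[4 * i + 3] = static_cast<uint8_t>(words[i]);
    }
    return bytes;
}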
Can anyone suggest anything?
NB:
I cannot post `GetSHA256()`, but it just returns the final state of the 8x `uint32_t`s used during the SHA256 transformations. `GetASCIIRepresentation()` splits each `uint32_t` into 8x 4-bit integers using masking and shifting, then converts each to ASCII (roughly as in the sketch below). `GetSHA256()` and `GetASCIIRepresentation()` are unit tested and work correctly on message inputs beyond 512 bits.
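For reference, `GetASCIIRepresentation()` behaves roughly like this sketch (`ToHex` is a hypothetical stand-in, not the real function):

#include <array>
#include <cstdint>
#include <string>

// Rough sketch of what GetASCIIRepresentation() does (ToHex is a
// hypothetical stand-in): split each uint32_t into eight 4-bit nibbles by
// shifting and masking, then map each nibble to its ASCII hex character.
std::string ToHex(const std::array<uint32_t, 8>& state) {
    static const char kHex[] = "0123456789abcdef";
    std::string out;
    out.reserve(64);
    for (uint32_t word : state) {
        // Walk the nibbles from most significant to least significant.
        for (int shift = 28; shift >= 0; shift -= 4) {
            out += kHex[(word >> shift) & 0xFu];
        }
    }
    return out;
}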