I am new to iOS and Swift. I am computing an HMAC-SHA256 in both Java and Swift, but for the same input values I get different results, and I am not sure why.
This is my Java code:

import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import org.apache.commons.codec.binary.Base64;

byte[] secret = new byte[] {0, 0, 0, 0, 38, 8, 117, -119};
byte[] message = new byte[] {5, 96, 98, 37, 5, -110, 99, 2, -125, 88, 55};

Mac sha256_HMAC = Mac.getInstance("HmacSHA256");
SecretKeySpec secret_key = new SecretKeySpec(secret, "HmacSHA256");
sha256_HMAC.init(secret_key);
String hash = Base64.encodeBase64String(sha256_HMAC.doFinal(message));
System.out.println(hash);
This is my Swift code:

func hmac(algorithm: CryptoAlgorithm, key: String) -> String {
    let str = self.cString(using: String.Encoding.utf8)
    let strLen = Int(self.lengthOfBytes(using: String.Encoding.utf8))
    let digestLen = algorithm.digestLength
    let result = UnsafeMutablePointer<CUnsignedChar>.allocate(capacity: digestLen)
    let keyStr = key.cString(using: String.Encoding.utf8)
    let keyLen = Int(key.lengthOfBytes(using: String.Encoding.utf8))
    CCHmac(algorithm.HMACAlgorithm, keyStr, keyLen, str, strLen, result)
    let digest = stringFromResult(result: result, length: digestLen)
    result.deallocate()
    return digest
}

private func stringFromResult(result: UnsafeMutablePointer<CUnsignedChar>, length: Int) -> String {
    let hash = NSMutableString()
    for i in 0..<length {
        hash.appendFormat("%02x", result[i])
    }
    return String(hash).lowercased()
}
The inputs to both functions are the same:

byte[] secret = new byte[] {0, 0, 0, 0, 38, 8, 117, -119};
byte[] message = new byte[] {5, 96, 98, 37, 5, -110, 99, 2, -125, 88, 55};
This is the result I am expecting:

Exact result as a byte array: [-18, -12, 79, 109, 84, 107, 37, 114, 80, 22, 83, 18, 126, -123, -73, -38, 120, -79, 49, 87, -109, -30, -103, -17, -87, 69, 117, 3, -89, 36, -46, 108]
Result as a Base64 string: 7vRPbVRrJXJQFlMSfoW32nixMVeT4pnvqUV1A6ck0mw=
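One difference between the two versions worth noting: the Java code feeds raw bytes into the MAC, while the Swift code passes the key and message through UTF-8 C strings. Some of these byte values (e.g. -119, i.e. 0x89) are not valid UTF-8 on their own, so a bytes → String → bytes round trip cannot preserve them. A minimal sketch of that lossiness, shown here in Java (the class name `Utf8RoundTrip` is made up for illustration; the Swift `cString(using:)` conversion is subject to the same limitation):

```java
import java.nio.charset.StandardCharsets;
import java.util.Arrays;

public class Utf8RoundTrip {
    public static void main(String[] args) {
        byte[] secret = new byte[] {0, 0, 0, 0, 38, 8, 117, -119};

        // Decoding as UTF-8 replaces the invalid byte 0x89 with U+FFFD,
        // which re-encodes as the three bytes EF BF BD.
        byte[] roundTripped = new String(secret, StandardCharsets.UTF_8)
                .getBytes(StandardCharsets.UTF_8);

        System.out.println(Arrays.equals(secret, roundTripped)); // false
        System.out.println(secret.length + " vs " + roundTripped.length); // 8 vs 10
    }
}
```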