
I am writing a client/server project that needs a signature. I use base64(hmac-sha1(key, data)) to generate the signature, but I get different signatures from my Python code and the Objective-C code:

get_signature('KEY', 'TEXT')   //python get 'dAOnR2oXWP9xa4vUBdDvVXTpzQo='
[self hmacsha1:@"KEY" @"TEXT"] //obj-c  get '7FH0NG0Ou4nb5luKUyjfrdWunos='

Not only are the base64 values different, the raw HMAC-SHA1 digests are different too. My friend and I have been trying to work it out for a few hours and still can't find it. Where is the problem in our code?

My python code:

import hmac
import hashlib
import base64
def get_signature(key, msg):
    return base64.b64encode(hmac.new(key, msg, hashlib.sha1).digest())
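When the two sides disagree, it helps to compare the raw hex digest as well as the base64 value, since that isolates an HMAC mismatch from a base64 mismatch. A small Python 3-compatible sketch (`debug_signature` is a hypothetical helper, not part of the original code):

```python
import hmac
import hashlib
import base64

def debug_signature(key, msg):
    # Hypothetical helper: encode str inputs to UTF-8 bytes so this
    # also runs on Python 3, where hmac.new() rejects plain str.
    if isinstance(key, str):
        key = key.encode('utf-8')
    if isinstance(msg, str):
        msg = msg.encode('utf-8')
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    # Return both forms: compare the hex digest across languages first.
    return digest.hex(), base64.b64encode(digest).decode('ascii')

hexdigest, sig = debug_signature('KEY', 'TEXT')
print(hexdigest)
print(sig)  # 7FH0NG0Ou4nb5luKUyjfrdWunos=
```

If the hex digests already differ, the inputs (encoding, stray whitespace, typos) are the culprit, not the base64 step.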

My friend's Objective-C code (copied from Objective-C sample code for HMAC-SHA1):

- (NSString *)hmac_sha1:(NSString *)key text:(NSString *)text {

    const char *cKey  = [key cStringUsingEncoding:NSASCIIStringEncoding];
    const char *cData = [text cStringUsingEncoding:NSASCIIStringEncoding];

    unsigned char cHMAC[CC_SHA1_DIGEST_LENGTH];
    CCHmac(kCCHmacAlgSHA1, cKey, strlen(cKey), cData, strlen(cData), cHMAC);

    NSData *HMAC = [[NSData alloc] initWithBytes:cHMAC length:sizeof(cHMAC)];
    NSString *hash = [GTMBase64 stringByEncodingData:HMAC];
    return hash;
}

SOLVED: Thanks to everyone below. But I have to admit that the real reason is I typed "TE S T" in my python IDE while I typed "TE X T" in this post :P

To avoid wasting your time, I ran some tests and arrived at a nicer solution, based on your answers:

print get_signature('KEY', 'TEXT')                        
# 7FH0NG0Ou4nb5luKUyjfrdWunos=

print get_signature(bytearray('KEY'), bytearray('TEXT'))  
# 7FH0NG0Ou4nb5luKUyjfrdWunos=

print get_signature('KEY', u'你好'.encode('utf-8'))  # best solution, i think!
# PxEm7Oibj7ijZ55ko7V3isSkD1Q=

print get_signature('KEY', bytearray(u'你好'))           
# TypeError: unicode argument without an encoding

print get_signature('KEY', u'你好')                      
# UnicodeEncodeError: 'ascii' codec can't encode characters in position 0-1: ordinal not in range(128)

print get_signature(u'KEY', 'TEXT')                      
# TypeError: character mapping must return integer, None or unicode

print get_signature(b'KEY', b'TEXT')                       
# 7FH0NG0Ou4nb5luKUyjfrdWunos=
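For completeness, here is how the working calls look under Python 3, where only bytes inputs are accepted (a sketch using the same `get_signature` as above; the expected values are the ones from the tests above):

```python
import hmac
import hashlib
import base64

def get_signature(key, msg):
    # Same function as in the question; on Python 3 both
    # arguments must already be bytes.
    return base64.b64encode(hmac.new(key, msg, hashlib.sha1).digest())

print(get_signature(b'KEY', b'TEXT'))
# b'7FH0NG0Ou4nb5luKUyjfrdWunos='

print(get_signature(b'KEY', '你好'.encode('utf-8')))
# b'PxEm7Oibj7ijZ55ko7V3isSkD1Q='
```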

Conclusion:

  1. The message to be signed should be encoded as a UTF-8 byte string on both sides.
  2. (Thanks to DJV) In Python 3, strings are all unicode, so they should be passed with the `b` prefix, as a `bytearray` (thanks to Burhan Khalid), or encoded to UTF-8 bytes.
clark
  • According to [this site](http://hash.online-convert.com/sha1-generator), your friend is the right one – dmg Mar 14 '13 at 09:30
  • Well, that was weird; your code produces `7FH0NG0Ou4nb5luKUyjfrdWunos=` as well on Python 2. Are you using Python 3? – dmg Mar 14 '13 at 09:32
  • Ok, final follow-up: if you are using Python 3, you should call it like `get_signature(b'KEY', b'TEXT')`, as `string`s are `unicode` – dmg Mar 14 '13 at 09:34

1 Answer


Your friend is completely right, but so are you (sort of). Your function is correct in both Python 2 and Python 3; however, your call is slightly wrong in Python 3. In Python 3, strings are unicode, so in order to pass an ASCII string (as your Objective-C friend does, and as you would in Python 2), you need to call your function with:

get_signature(b'KEY', b'TEXT')

in order to specify that those strings are bytes, a.k.a. ASCII strings.

EDIT: As Burhan Khalid noted, the flexible way of doing this in Python 3 is to either call your function like this:

get_signature(key.encode('ascii'), text.encode('ascii'))

or define it as:

def get_signature(key, msg):
    # Encode str inputs to bytes so hmac.new() accepts them on Python 3.
    if isinstance(key, str):
        key = key.encode('ascii')
    if isinstance(msg, str):
        msg = msg.encode('ascii')
    return base64.b64encode(hmac.new(key, msg, hashlib.sha1).digest())
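A quick usage sketch of the flexible version: either `str` or `bytes` arguments now produce the same signature on Python 3 (self-contained copy of the function above, with its imports):

```python
import hmac
import hashlib
import base64

def get_signature(key, msg):
    # Accept str or bytes: encode str inputs before handing them to hmac.new().
    if isinstance(key, str):
        key = key.encode('ascii')
    if isinstance(msg, str):
        msg = msg.encode('ascii')
    return base64.b64encode(hmac.new(key, msg, hashlib.sha1).digest())

# Both calls work and agree:
print(get_signature('KEY', 'TEXT'))
print(get_signature(b'KEY', b'TEXT'))
# b'7FH0NG0Ou4nb5luKUyjfrdWunos=' in both cases
```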
dmg