I ran into a rather strange bug in my code... at some point I have to convert values to unicode strings, except for a few special values (None, True, False) that need special treatment.
The code looks something like this:
conversions = {
    None: 'null',
    True: 'true',
    False: 'false',
}

def translate(val):
    return conversions.get(val, unicode(val))
But when the value 1 is passed in, I got back u'true' while I was expecting u'1'.
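Here is a minimal interactive check against the simplified snippet above (my real code uses unicode values, so it returns u'true' rather than 'true', but the collision is the same):

>>> translate(1)   # expected u'1'
'true'
>>> translate(0)   # expected u'0'
'false'
>>> translate(2)   # non-special values convert as expected
u'2'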
So I opened a console and played a little bit:
Python 2.7.5 (default, May 15 2013, 22:43:36) [MSC v.1500 32 bit (Intel)] on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> d = {True: 'yeah', False: 'boooh'}
>>> d[1]
'yeah'
>>> d[0]
'boooh'
>>> 1 in d
True
>>> 0 in d
True
>>> True is 1
False
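I also checked equality and hashing, since I assume dictionary lookups depend on both (not sure this is the whole story):

>>> True == 1
True
>>> hash(True) == hash(1)
True
>>> isinstance(True, int)
True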
So it turns out that 0 and 1 behave the same way as False and True when used as dictionary keys. Why? It is confusing and leads to strange bugs like this one...