Consider the following code in Python 3.5.2:
class NewObj():
    def __init__(self, refs={}):
        self.refs = refs

class ObjA(NewObj):
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)

class ObjB(NewObj):
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.refs['c'] = 3

a = ObjA(refs={'a': 1, 'b': 2})
b = ObjA()
c = ObjB()

lst = [a, b, c]
for obj in lst:
    print('%s has refs: %s' % (obj, obj.refs))
The output of the code is:
<__main__.ObjA object at 0x7f74f0f369b0> has refs: {'a': 1, 'b': 2}
<__main__.ObjA object at 0x7f74f0f36a90> has refs: {'c': 3}
<__main__.ObjB object at 0x7f74f0f36ac8> has refs: {'c': 3}
It is the second line of output that is causing me confusion - I would expect an empty dictionary to be printed. Since b is assigned an instance of ObjA constructed with no arguments, b.refs == {} should be True, as per the default initialisation.
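While investigating, I noticed the same effect with a plain function, which suggests it is not specific to classes or super() (minimal sketch, with a hypothetical function f):

```python
def f(d={}):
    # The default dict is created once, when the def statement runs,
    # and the same object is reused on every call that omits d.
    d['x'] = d.get('x', 0) + 1
    return d

print(f())  # {'x': 1}
print(f())  # {'x': 2} - the same dict, mutated across calls
print(f.__defaults__)  # the shared default object is visible here
```

So the default appears to be evaluated once rather than per call, which looks like the same thing happening to refs.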
Is this a bug or desired behaviour? If it's not a bug, could I please get an explanation of why this behaviour is desired, and the most minimal change to the code that gets the output I intend (i.e. when no arguments are provided, .refs is initialised to an empty dict)?
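For reference, the workaround I am currently considering is the None-sentinel idiom I have seen suggested elsewhere - I would appreciate confirmation that this is the idiomatic fix (sketch below, only NewObj changes):

```python
class NewObj():
    def __init__(self, refs=None):
        # None is a safe sentinel: a fresh dict is created per instance,
        # instead of one dict shared by every no-argument construction.
        self.refs = refs if refs is not None else {}

class ObjA(NewObj):
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)

class ObjB(NewObj):
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.refs['c'] = 3

a = ObjA(refs={'a': 1, 'b': 2})
b = ObjA()
c = ObjB()
print(b.refs)  # {} - no longer affected by ObjB's mutation
```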