I am trying to build trees in Python based on a Node class whose objects have two attributes: a label and a list of children (pointers to other nodes).

For convenience, I make the __init__ argument c (the list of children) optional, with a default of [], since in most cases my nodes have no children when I create them.

My code is below:

class Node:
    def __init__(self, l, c=[]):
        self.label = l
        self.children = c

    def insert(self, x):
        self.children.append(Node(x))

t1 = Node(0)    # new tree, with root = 0
t1.insert(1)    # new node = 1, child of 0
t2 = Node(2)    # new tree, with root = 2

### Output: (addresses are modified for readability)
# t1 = <__main__.Node instance at 0>
# t1.children = [ <__main__.Node instance at 1> ]
# t2 = <__main__.Node instance at 2>
# ... so far so good ...
# t1.children[0].children = [ <__main__.Node instance at 1> ]
# t2.children = [ <__main__.Node instance at 1> ]

As illustrated by the output above, the value of children is the same for all nodes. This remains true if I then type (e.g.) t1.insert(3); t2.insert(4), after which every node has three children (1, 3 and 4).
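
In fact, an identity check along the following lines (a sketch reusing t1 and t2 from above) would show that these are all literally the same list object:

print(t1.children is t2.children)              # True
print(t1.children is t1.children[0].children)  # True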

However, the problem disappears when I explicitly pass c (whether an empty list or something else) when creating my objects...
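
For example (a sketch with hypothetical labels 5, 6 and 7), each node then gets its own list:

t3 = Node(5, [])    # explicitly pass an empty list
t4 = Node(6, [])
t3.insert(7)
print(t4.children)  # [] -- t4 is unaffected, as I would expect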

I am not sure whether it is my understanding of how Python initializes objects or of how Python handles optional arguments that is completely wrong... Could someone explain what is going on here?
