In Python
li = [0] * 3
li[0] = 1
# li == [1, 0, 0]
This works fine.
li = [set()] * 3
li[0].add(1)
# li == [{1}, {1}, {1}]
This adds 1 to every set in the list. What is the difference between these two?
Here:
li = [0] * 3
li[0] = 1
You create a list with three elements, all of which are the immutable number 0. Then you replace the first element with the immutable number 1.
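To make the rebinding explicit, here is a small sketch. Note that assignment to `li[0]` always rebinds that slot to a new object, even when the other slots still share a single set:

```python
li = [set()] * 3
li[0] = {1}   # rebinds slot 0 to a brand-new set; slots 1 and 2 are untouched
print(li)     # [{1}, set(), set()]
print(li[1] is li[2])  # True: slots 1 and 2 still alias the original set
```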
Whereas here:
li = [set()] * 3
li[0].add(1)
You create a single set (not three sets), and put it into a list three times. You then reference the first element of the list, which of course is the set, and add a number into it. Since only one set was ever created, this appears to modify every element of the list (because they are all the same).
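You can confirm the aliasing with the `is` operator, and avoid it by using a list comprehension, which evaluates `set()` once per element:

```python
li = [set()] * 3
print(li[0] is li[1] is li[2])  # True: all three slots hold the same set

# Fix: a list comprehension creates three independent sets
li = [set() for _ in range(3)]
li[0].add(1)
print(li)               # [{1}, set(), set()]
print(li[0] is li[1])   # False
```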
For more about the dangers of [x] * n
in Python, see: List of lists changes reflected across sublists unexpectedly
Objects in Python are stored and used by reference. When you write:
li = [set()] * 3
you create a single object and then copy its reference three times, rather than copying the object itself.
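One way to see this is to compare object identities with `id()`: the multiplied list contains one unique object, while a comprehension produces three:

```python
li = [set()] * 3
print(len({id(s) for s in li}))   # 1: every slot references the same object

li2 = [set() for _ in range(3)]
print(len({id(s) for s in li2}))  # 3: three distinct objects
```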