I have a Graph object with an attribute adj_list, which is initialized to a list of empty sets. When I call my addEdge(self, u, v) method, the expected behavior is that u gets added to the set at adj_list[v] and v gets added to the set at adj_list[u]. But judging from the console output, calling self.adj_list[u].add(v) adds v to every single set in the list.
Here's my constructor:
def __init__(self, n):
    """
    Constructor
    ------------
    Parameters:
    n: int
        number of nodes in the graph
    ------------
    Attribute:
    adj_list: list of sets of ints
        set at index i contains the neighbors of node i (nodes represented as ints
        from 0 to n-1)
    """
    # adjacency list of graph (entry at index i = set of neighbors of node i),
    # initialized with empty neighbor sets
    self.adj_list = [set()] * n
And here's the addEdge function:
def addEdge(self, u, v):
    """
    Adds an edge between vertices u & v
    -----------
    Parameters:
    u: int
        vertex at one end of the edge
    v: int
        vertex at the other end of the edge
    -----------
    Returns:
    void
    """
    if u < 0 or u > len(self.adj_list) - 1 or v < 0 or v > len(self.adj_list) - 1:
        raise Exception("Invalid start/end vertices for edge")
    # add edge by adding each vertex to the other's neighbor set if it isn't already there
    if v not in self.adj_list[u] and u not in self.adj_list[v]:
        # undirected graph -> add edge in both directions
        print("before", self.adj_list)
        self.adj_list[u].add(v)
        print("one add", self.adj_list)
        self.adj_list[v].add(u)
        print("after", self.adj_list)
This is how I called the function:
if __name__ == "__main__":
g = Graph(3)
g.addEdge(0, 1)
This is what printed out:
before [set(), set(), set()]
one add [{1}, {1}, {1}]
after [{0, 1}, {0, 1}, {0, 1}]
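In case it helps, the same behavior reproduces with a plain list of sets, outside the Graph class entirely. This is a minimal standalone snippet (the variable name `sets` is just for illustration):

```python
# Build a list of three "empty" sets the same way the constructor does.
sets = [set()] * 3

# Adding to the set at index 0 appears to affect every index.
sets[0].add(1)
print(sets)                # prints [{1}, {1}, {1}]
print(sets[0] is sets[1])  # prints True
```

So the entries don't just compare equal with `==`; they are identical according to `is`.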
Could anyone help me figure out why this is happening and how to fix it? Thank you.