I am trying to write a hashable dictionary:
class HashableDict(dict):
    def __hash__(self):
        return hash(frozenset(self.items()))
However, this leads to unexpected behaviour. If I create a set like this:
x = HashableDict({0: 1})
y = HashableDict({0: 2})
print(set((x, y)))
This, as expected, prints {{0: 1}, {0: 2}}. However, if I create the following set:
x = HashableDict({0: 1})
print(set(x))
Python 3.9 prints {0}. This does not happen if I use {x} instead (i.e., {0: 1} is printed). To add to the mystery, the following code:
A = set(x)
A.add(y)
print(A)
prints {0, {0: 2}}. So this only happens for the first element of the set...
What am I doing wrong? Note that I also tried adding an __eq__ method that works consistently with __hash__, but this does not solve it:
def __eq__(self, other):
    return frozenset(self.items()) == frozenset(other.items())