
I am trying to write a hashable dictionary:

class HashableDict(dict):
    def __hash__(self):
        return hash(frozenset(self.items()))

However, this leads to an unexpected behaviour. If I create a set like this:

x = HashableDict({0: 1})
y = HashableDict({0: 2})
print(set((x, y)))

This, as expected, prints {{0: 1}, {0: 2}}. However, if I create the following set:

x = HashableDict({0: 1})
print(set(x))

Python 3.9 prints {0}. This does not happen if I use {x} instead (i.e., {0: 1} is printed). To add to the mystery, the following code:

A = set(x)
A.add(y)
print(A)

Prints {0, {0: 2}}. So this only happens for the first element of the set...

What am I doing wrong? Note that I also tried adding an __eq__ method that works consistently with __hash__, but this does not solve it:

def __eq__(self, other):
    return frozenset(self.items()) == frozenset(other.items())

1 Answer


The set constructor takes an iterable as its argument and creates a set of all the iterable's elements. Since iterating a dict yields only its keys, you get a set of its keys:

set(x) # takes x and turns it into a set!
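You can verify this with a plain dict (the same iteration rules apply to HashableDict; the variable name d is just for illustration):

d = {0: 1}
print(list(d))          # [0] - iterating a dict yields its keys
print(set(d))           # {0} - the same keys, collected into a set
print(list(d.items()))  # [(0, 1)] - items() iterates key-value pairs instead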

You, however, want a set containing x:

set([x])  # takes a list containing x and turns it into a set
{x}       # the equivalent set literal
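As a quick sanity check, both spellings produce the same one-element set (reusing the HashableDict class from the question):

class HashableDict(dict):
    def __hash__(self):
        return hash(frozenset(self.items()))

x = HashableDict({0: 1})
print({x})              # {{0: 1}} - a set containing the dict itself
print(set([x]) == {x})  # True - the constructor form with a list is equivalent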

Note that {x} and set(x) are not the same, regardless of the type of x!

Also see What is the difference between a list of a single iterable `list(x)` vs `[x]`? and the many other duplicates concerned with this "collection literal vs. constructor" distinction.
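Putting it all together, here is a minimal sketch of the class from the question with a consistent __eq__ (the isinstance guard is an addition for safety, not part of the original code), plus the set built the intended way:

class HashableDict(dict):
    def __hash__(self):
        return hash(frozenset(self.items()))

    def __eq__(self, other):
        # The guard is an addition; the question's version assumes
        # other always has an .items() method.
        if not isinstance(other, dict):
            return NotImplemented
        return frozenset(self.items()) == frozenset(other.items())

x = HashableDict({0: 1})
y = HashableDict({0: 2})
A = {x}      # a set containing x, not set(x)
A.add(y)
print(A)     # {{0: 1}, {0: 2}} (set order may vary)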
