The problem arises only when comparing a Decimal with a complex number. Floats, ints, etc. work fine because they are coerced for comparison.
In Python 3's decimal module, comparison with a complex number is handled explicitly in `_convert_for_comparison`:

    # Comparisons with float and complex types.  == and != comparisons
    # with complex numbers should succeed, returning either True or False
    # as appropriate.  Other comparisons return NotImplemented.
    if equality_op and isinstance(other, _numbers.Complex) and other.imag == 0:
        other = other.real
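The effect of that special case is easy to check in Python 3: equality with a complex number succeeds when the imaginary part is zero, while ordering comparisons still fall through to `NotImplemented` and raise:

```python
from decimal import Decimal

# Python 3: == with a complex number succeeds when the imaginary part is zero.
print(Decimal('1') == (1+0j))   # True
print(Decimal('1') == (1+1j))   # False: nonzero imaginary part

# Ordering comparisons are not covered by the special case above,
# so they return NotImplemented on both sides and raise TypeError.
try:
    Decimal('1') < (1+0j)
except TypeError as exc:
    print(exc)
```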
In Python 2 there is no such special case; the conversion function's implementation is:
    def _convert_other(other, raiseit=False, allow_float=False):
        """Convert other to Decimal.

        Verifies that it's ok to use in an implicit construction.
        If allow_float is true, allow conversion from float; this
        is used in the comparison methods (__eq__ and friends).
        """
        if isinstance(other, Decimal):
            return other
        if isinstance(other, (int, long)):
            return Decimal(other)
        if allow_float and isinstance(other, float):
            return Decimal.from_float(other)
        if raiseit:
            raise TypeError("Unable to convert %s to Decimal" % other)
        return NotImplemented
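Since `_convert_other` never handles complex, Python 2's `Decimal.__eq__` returns `NotImplemented` for a complex operand, the reflected comparison also fails, and `==` falls back to a default, unequal answer. The fallback mechanics can be sketched in Python 3 with a hypothetical `Py2ishNum` class (a toy stand-in, not the real Decimal):

```python
class Py2ishNum:
    """Toy numeric type whose __eq__, like Python 2's Decimal,
    never handles complex operands (hypothetical stand-in)."""

    def __init__(self, value):
        self.value = value

    def __eq__(self, other):
        if isinstance(other, (int, float)):
            return self.value == other
        return NotImplemented   # complex falls through here

    def __hash__(self):
        return hash(self.value)


print(Py2ishNum(1) == 1)        # True: int is handled explicitly

# Py2ishNum.__eq__ returns NotImplemented, complex.__eq__ also returns
# NotImplemented, so == falls back to the default (identity) answer: False.
print(Py2ishNum(1) == (1+0j))   # False
```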
The reason `len(set([1, Decimal('1'), (1+0j)])) == 2` is that the Decimal is compared to the int first; if you change the ordering you get different output:
    In [23]: {1, Decimal('1'), (1+0j)}
    Out[23]: {(1+0j), Decimal('1')}

    In [24]: {Decimal('1'), 1, (1+0j)}
    Out[24]: {(1+0j), Decimal('1')}

    In [25]: {Decimal('1'), (1+0j), 1}
    Out[25]: {1}
Using a literal is also better because the insertion order is explicit; since equality between these objects is not transitive in Python 2, which values survive depends on the order they are inserted, as the transcripts above show.
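For contrast, in Python 3 the special case shown earlier makes all three values compare equal, and the numeric hash invariant gives them the same hash, so the set collapses to a single element regardless of insertion order. A quick sanity check:

```python
from decimal import Decimal

# Python 3: 1, Decimal('1') and (1+0j) are all equal and hash alike,
# so every insertion order produces a one-element set.
for trio in [(1, Decimal('1'), (1+0j)),
             (Decimal('1'), 1, (1+0j)),
             (Decimal('1'), (1+0j), 1)]:
    print(len(set(trio)))   # 1 each time
```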