Consider the following code:
class Foo:
    def __mul__(self,other):
        return other/0
x = Foo()
x.__mul__ = lambda other:other*0.5
print(x.__mul__(5))
print(x*5)
In Python 2 (with from __future__ import print_function), this outputs
2.5
2.5
In Python 3, this outputs
2.5
---------------------------------------------------------------------------
ZeroDivisionError Traceback (most recent call last)
<ipython-input-1-36322c94fe3a> in <module>()
5 x.__mul__ = lambda other:other*0.5
6 print(x.__mul__(5))
----> 7 print(x*5)
<ipython-input-1-36322c94fe3a> in __mul__(self, other)
1 class Foo:
2 def __mul__(self,other):
----> 3 return other/0
4 x = Foo()
5 x.__mul__ = lambda other:other*0.5
ZeroDivisionError: division by zero
I ran into this situation while implementing a type that supports a subset of algebraic operations. For one particular instance, I needed to replace the multiplication method to get lazy behaviour: some computation has to be deferred until that instance is multiplied by another variable. The monkey patch worked in Python 2, but it fails in Python 3.
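To make the use case concrete, here is a rough sketch of the pattern (the names Term and make_lazy are made up for illustration; the real class does actual algebra):

class Term:
    def __init__(self, value):
        self.value = value
    def __mul__(self, other):
        # eager multiplication used by ordinary instances
        return self.value * other

def make_lazy(term):
    # patch this one instance so that multiplying it records the
    # deferred computation instead of evaluating it immediately
    # (illustrative stand-in for the real deferred computation)
    term.__mul__ = lambda other: ('deferred', term, other)
    return term

t = make_lazy(Term(2))
print(t.__mul__(5))   # explicit call uses the patched lambda
print(t * 5)          # Python 2: patched lambda; Python 3: Term.__mul__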
Why does this happen? Is there any way to get more flexible operator overloading in Python 3?