I wanted to make a probability-like class for practice, so I wrote a class P that stores a value associated with each event. I also wanted to be able to add probabilities like P("a") + P("b") and have the result be the sum of their values. That part was fine to code, but I ran into some strange behavior while testing. I've pasted only the relevant parts of the code below [which is why it might look a bit incomplete]:
class P:
    def __init__(self, event):
        self.event = event
        self.v = 0

    def value(self, val):
        """Sets the probability to the value 'val'."""
        self.v = val

    @staticmethod
    def add_stuff(x, y):
        """Returns the sum of the two numbers."""
        return x + y

    def __add__(self, other):
        if isinstance(other, P):  # if we are adding two P's together.
            return P.add_stuff(self.v, other.v)
        else:  # if we are adding a number to our P.
            try:
                return P.add_stuff(self.v, other)
            except TypeError:
                raise TypeError(self.type_error_string)
a = P("a") # Creates the instances.
b = P("b") #
c = P("c") #
a.value(0.5) # Sets the value of a.v to 0.5,
b.value(0.1) # and so on for b and c.
c.value(0.2) #
print a.v + b.v == 0.6 # prints True.
print b.v == 0.1 # prints True.
print c.v == 0.2 # prints True.
print b.v + c.v # prints 0.3.
print type(b.v + c.v) # prints <type 'float'>.
print b.v + c.v == 0.3 # prints False (!!).
The relevant part is at the bottom. Note that a.v + b.v [as well as some other combinations] compared fine when testing, but b.v + c.v == 0.3 comes out False for some reason, even though printing b.v + c.v shows 0.3 and its type is float. I'm not sure what is happening here.
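In case it helps narrow things down, here is a minimal check with plain float literals (no P objects involved; I'm assuming a standard CPython build with 64-bit IEEE 754 doubles) that reproduces the same surprise. The %.17f format is only there to show more digits than the default str output that print uses:

# Plain float literals, outside the class entirely.
# (Assuming 64-bit IEEE 754 doubles, as in a standard CPython build.)
print '%.17f' % (0.5 + 0.1)   # 0.59999999999999998
print '%.17f' % (0.1 + 0.2)   # 0.30000000000000004
print '%.17f' % 0.3           # 0.29999999999999999
print 0.1 + 0.2 == 0.3        # False -- the same mismatch as b.v + c.v == 0.3
print 0.5 + 0.1 == 0.6        # True  -- like a.v + b.v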