I am trying to add a lambda function to the symbol table of a class:
class B:
    def __init__(self):
        func = lambda self, x: x*x  # <-- Adding 'self' produces error
        self.__dict__['test1'] = func
        print(self.test1(2))
        print(self.test2(2))

    def test2(self, b):
        return b*b*b

b = B()
but this produces an error (running the script with python t.py):
Traceback (most recent call last):
  File "./t.py", line 14, in <module>
    b = B()
  File "./t.py", line 8, in __init__
    print(self.test1(2))
TypeError: <lambda>() missing 1 required positional argument: 'x'
However, if I remove self as an argument to the lambda, it works fine. Why isn't self required as an argument for the lambda function here?
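For what it's worth, here is a stripped-down sketch of the difference I'm seeing; my guess (unconfirmed) is that a function stored directly in the instance's __dict__ is looked up as a plain function rather than turned into a bound method, so no implicit self is passed:

```python
class B:
    def __init__(self):
        # Stored directly in the instance __dict__: looked up as a
        # plain function, so calling it passes only the explicit args.
        self.__dict__['test1'] = lambda x: x*x

    def test2(self, b):
        # Defined on the class: attribute access yields a bound
        # method, which supplies the instance as the first argument.
        return b*b*b

b = B()
print(type(b.test1))  # <class 'function'>
print(type(b.test2))  # <class 'method'>
print(b.test1(2))     # 4 -- no self is passed
print(b.test2(2))     # 8
```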