2

All:

def a(p):
    return p + 1

def gen(func, k=100):
    l = []
    for x in range(k):
        temp = ("%s_with_parameter_%s" % (func.__name__, x), lambda: func(x))
        # maybe this makes my question clearer:
        # I want a list/dict that binds a self-defined string and a function together
        l.append(temp)
    return l

l = gen(a, 100) 

for x in range(len(l)):
    l[x][1]()

100
100
100
100
100
100
100
100

... I expected the output to count from 1 up to 101, but instead it shows 100 for every call.

Can I get some help with this snippet?

Thanks!

user478514

5 Answers

6

As other answers have noted, the lambdas all use the last value of x because they are closures over the variable itself and so see any later changes made to it. The trick is to bind each lambda to the current value.

You can do this by writing them as

lambda x=x: func(x)

This binds the value that x has when the lambda is created to the lambda's default parameter x, which means that the lambda is no longer a closure over the enclosing x and is not affected by any later changes to its value. You would then change the way you call it so that you don't pass in an argument you never use:

for x in range(len(l)):
    l[x][1]()

Now, the lambda uses the default value, which is bound to what you want.

If you do actually need to pass a value in, you can simply put the default parameter used for binding after the 'real' parameter:

lambda p, x=x: func(p, x)
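
Applied to the gen() from the question, a minimal sketch of the fixed version might look like this (it keeps the (name, callable) tuples from the question; the print loop is only there to show the values):

def a(p):
    return p + 1

def gen(func, k=100):
    l = []
    for x in range(k):
        # x=x copies the current value of x into a default argument,
        # so each lambda keeps its own value instead of sharing the loop variable
        l.append(("%s_with_parameter_%s" % (func.__name__, x), lambda x=x: func(x)))
    return l

l = gen(a, 100)
for name, f in l:
    print(f())    # prints 1, 2, ..., 100
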
aaronasterling
  • Hmm. Why do you have to bind the value explicitly? And under which circumstances is this necessary? – Konrad Rudolph Nov 08 '10 at 11:35
  • @Konrad Rudolph. You have to bind the value to a new variable explicitly so that python doesn't close the lambda over `x`. You would do this anytime that you want to create a function that has a variable bound to the value that a variable references *at the time of creation*. What we're doing is creating a local variable for the function to access. – aaronasterling Nov 08 '10 at 11:42
  • thank you very much! It works now! By the way, can `functools.partial` also solve this, instead of `lambda p, x=x: func(p, x)` and `lambda x=x: func(x)`? If x is very big, keeping copies may consume too much memory; maybe `functools.partial` can avoid that? – user478514 Nov 08 '10 at 11:43
  • @user, while `functools.partial` will solve the problem just as well as this does, I don't think it will cut down on memory use. – aaronasterling Nov 08 '10 at 11:50
3

The standard trick is to write your function gen() as:

def gen(func, k=100):
    l = []
    for x in range(k):
        l.append(lambda x=x: func(x))
    return l

Note the parameter of the lambda expression. It ensures that a new x is created for each lambda. Otherwise, they would all use the same x from the enclosing scope.
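
A quick check of this version, as a sketch that reuses the a() defined in the question:

l = gen(a, 100)
for f in l:
    print(f())    # prints 1, 2, ..., 100 -- one value per lambda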

Sven Marnach
2

Maybe this slight variant of your code will make it clearer what happens.

def a(p):
    return p + 1

def gen(func, k=100):
    l = []
    for x in range(k):
        l.append(lambda p: func(x))
    x = 77
    return l

l = gen(a, 100)

for x in range(len(l)):
    print(l[x](10))

Now you always get 78 when calling l[x]. The problem is that the closure always captures the same variable x, not its value at the moment the lambda is defined.

I believe you probably want something like partial application (currying, in Haskell terms). In Python you can use functools.partial for that purpose. I would write your code as below; notice I have removed the dummy parameter.

import functools

def a(p):
    return p + 1

def gen(func, k=100):
    l = []
    for x in range(k):
        l.append(functools.partial(func, x))
    return l

l = gen(a, 100)

for x in range(len(l)):
    print(l[x]())

Now let's bring the dummy parameter back:

import functools

def a(p):
    return p + 1

def gen(func, k=100):
    fn = lambda x, p: func(x)
    l = []
    for x in range(k):
        l.append(functools.partial(fn, x))
    return l

l = gen(a, 100)

for x in range(len(l)):
    print(l[x](10))

Edit: I much prefer the solution of Sven Marnach over mine :-)

kriss
  • But that’s a bug in Python, right? The iteration variable *should* by rights be closed over. After all, the code *works* when you first copy the iteration variable `x` into another variable and use *that* in the lambda. – Konrad Rudolph Nov 08 '10 at 11:20
  • Not a bug. The lambda is not called when it is defined, hence it's always the same x (the one with the last value) that is used in the end. – kriss Nov 08 '10 at 11:26
  • @kriss: The time of invocation is irrelevant. Python creates *closures* for lambdas. Why doesn’t it do this here? Why doesn’t it close over the variable `x`, i.e. capturing its *current* meaning? Python does this in different but similar situations. I acknowledge that this isn’t a bug (since apparently it’s documented) but it’s inconsistent and bad. How *do* you get Python to create a closure here? – Konrad Rudolph Nov 08 '10 at 11:34
  • @aaronasterling: the problem is *not* to avoid the closure – we *want* the closure, otherwise calling the lambda later wouldn’t have *any* value for the `x` inside the lambda. But I understand the behaviour now – apparently Python, unlike other languages, creates a *lexical* closure. Hmm. – Konrad Rudolph Nov 08 '10 at 11:41
  • @Konrad, the problem is most definitely to avoid the closure and give the function a *local* variable to work with. What languages have anything other than lexical closure? That's the only kind I'm aware of. – aaronasterling Nov 08 '10 at 11:44
  • @aaronasterling: Perl, for one. See http://stackoverflow.com/questions/233673/lexical-closures-in-python. – Konrad Rudolph Nov 08 '10 at 11:46
  • deleted my earlier comment that @Konrad was replying to because I changed my mind about a derogatory comment I made about the solution. – aaronasterling Nov 08 '10 at 11:46
  • @Konrad Rudolph: python behavior is indeed surprising here. Some could say python closures are not real closures. – kriss Nov 08 '10 at 11:50
  • @Kriss, the only reason that python closures aren't "real" closures is because a closed-over variable can't be written to. This is fixed with the new `nonlocal` keyword. One could say (and I would agree) that python doesn't have real lambdas because they are limited to a single expression, but that's another story. – aaronasterling Nov 08 '10 at 11:52
  • @aaronsterling: yes, you are absolutely right. But I would say that the problem with closed over variables is not that you can't write to them, but that you can but shouldn't. – kriss Nov 08 '10 at 12:00
  • @kriss, thanks, I was asking aaronasterling about `functools.partial` and then read your solution. I suppose that with 'lambda x=x: func(x)' a copy of x is kept in memory, so for a big x there may be 100 copies in my case; can `functools.partial` or another solution avoid this memory consumption problem? – user478514 Nov 08 '10 at 12:00
  • @user478514: sadly, I don't think so. I was not aware of the solution from Sven, but it's probably much better than mine. – kriss Nov 08 '10 at 12:04
0

In your code, the value used by a() is the last value that x took in the loop (99, so a() returns 100), while your p variable is actually a dummy (as you mention). This is because the lambda function is "resolved" when you call it.

It means that calling l[x](10) would be "equivalent" to (this is not valid code, it's just to illustrate):

lambda 10: func(99)
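
What the closure captures is the variable itself, and its value is looked up when the lambda is called. A minimal sketch of that behaviour, separate from the question's code (the names here are purely illustrative):

x = 1
f = lambda: x     # f closes over the variable x, not over the value 1
print(f())        # prints 1
x = 2
print(f())        # prints 2 -- the current value of x is read at call time
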
Antoine Pelisse
  • -1. What do you mean "resolved when you call it"? The function is created as soon as the statement is executed. The function is closed over `x` and maintains a reference to it. Nothing about the function, the reference, or the value is modified in any way when the function is called. It all happens at function creation time. – aaronasterling Nov 08 '10 at 11:34
0

You can use the yield statement to create a generator, which I think is the most elegant solution:

>>> def a(p):
...     return p+1
...
>>> def gen(func, k=100):
...     for x in range(k):
...         yield lambda: func(x)
...
>>> for item in gen(a, 100):
...     item()
...
1
2
3
4
(...)
100
>>>

But as you can see it only goes up to 100, because range(100) stops at 99:

>>> range(100)
[0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21,
 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41,
 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61,
 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81,
 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99]
>>>

You can use gen(a, 101) to solve this:

>>> for item in gen(a, 101):
...     item()
...
1
2
3
4
5
(...)
101
>>>
Magnun Leno
  • -1 try `list(gen(a, 101))`. This method will have the same problem as the original code. It works in the one specialized instance shown but will, for that reason, be fragile. – aaronasterling Nov 08 '10 at 12:10
  • thanks, a `generator` also works, and I am sure it is a better solution, but I need the function list for other reasons here. For example, if I need to use the name strings I kept, I still have to find a way to store them in a list so I don't have to loop over the generator each time I need them. – user478514 Nov 08 '10 at 12:13
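
For completeness, a minimal sketch of one way to keep the name-to-function binding the question asks for, built on the default-argument fix above (the dict layout here is an assumption about what is wanted):

def a(p):
    return p + 1

def gen(func, k=100):
    # map each generated name to a callable that remembers its own value of x
    d = {}
    for x in range(k):
        d["%s_with_parameter_%s" % (func.__name__, x)] = lambda x=x: func(x)
    return d

funcs = gen(a, 100)
print(funcs["a_with_parameter_5"]())    # prints 6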