
I am seeing some strange behaviour with a list of lambda functions that evaluate Theano expressions. The code is below:

import numpy as np
import theano.tensor as tt

# Equivalent functions (or at least I assume so)
def tilted_loss(q,y,f):
    e = (y-f)
    return (q*tt.sum(e)-tt.sum(e[(e<0).nonzero()]))/e.shape[0]

def tilted_loss2(y,f):
    q = 0.05
    e = (y-f)
    return (q*tt.sum(e)-tt.sum(e[(e<0).nonzero()]))/e.shape[0]

def tilted_loss_np(q,y,f):
    e = (y-f)
    return (q*sum(e)-sum(e[e<0]))/e.shape[0]

# lambda functions which uses above functions
qs = np.arange(0.05,1,0.05)
q_loss_f = [lambda y,f: tilted_loss(q,y,f) for q in qs]
q_loss_f2 = lambda y,f:tilted_loss(0.05,y,f)
q_loss_f3 = lambda y,f:tilted_loss(qs[0],y,f)

# Test the functions
np.random.seed(1)
a = np.random.randn(1000,1)
b = np.random.randn(1000,1)
print(q_loss_f[0](a,b).eval())
print(q_loss_f2(a,b).eval())
print(q_loss_f3(a,b).eval())
print(tilted_loss2(a,b).eval())
print(tilted_loss_np(qs[0],a,b)[0])

This gives the output:

0.571973847658054
0.5616355181780912
0.5616355181695327
0.5616355181780912
0.56163551817
Note that the first value differs from all the others. Two questions:

  1. I must be doing something wrong with the way the list of functions q_loss_f is defined. What is it?
  2. Is the way that q is passed in OK? It's a numpy value that I'm sending in, but this seems to be fine in q_loss_f3.

Any thoughts?

sachinruk
    Possible duplicate of [What do (lambda) function closures capture in Python?](http://stackoverflow.com/questions/2295290/what-do-lambda-function-closures-capture-in-python) – Ilja Everilä Jul 22 '16 at 07:43

1 Answer


This is a common error: the q inside the lambda is looked up when the lambda is *called*, not when it is defined, so every function in the list sees the last value from the comprehension loop (0.95). You can bind the value at creation time with functools.partial instead:

from functools import partial

# bind q positionally; partial(tilted_loss, q=q) would clash with the
# positional y, f at call time, since they would also fill the q slot
q_loss_f = [partial(tilted_loss, q) for q in qs]
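The late-binding behaviour, and how partial avoids it, can be seen without Theano at all. A minimal sketch, using a toy stand-in for tilted_loss that simply returns the captured q:

```python
from functools import partial

def tilted_loss(q, y, f):
    # toy stand-in: return q so the captured value is visible
    return q

qs = [0.05, 0.10, 0.95]

# late binding: every lambda reads q at call time, after the loop ended,
# so all three see the final value 0.95
late = [lambda y, f: tilted_loss(q, y, f) for q in qs]
print([g(None, None) for g in late])   # [0.95, 0.95, 0.95]

# partial freezes the value of q at creation time
bound = [partial(tilted_loss, q) for q in qs]
print([g(None, None) for g in bound])  # [0.05, 0.1, 0.95]
```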
Netwave
  • In my particular case I cannot use partial because for some reason q_loss_f does not have the attribute __name__ as all `def` functions do. Anyway, solved it by `q_loss_f = [lambda y,f,q=q: tilted_loss(q,y,f) for q in qs]` – sachinruk Jul 22 '16 at 07:15
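The default-argument trick from that comment can be checked with the same toy stand-in: q=q is evaluated once per loop iteration, so each lambda keeps its own copy, and the result is still a plain function (with a __name__), unlike a functools.partial object:

```python
def tilted_loss(q, y, f):
    # toy stand-in: return q so the captured value is visible
    return q

qs = [0.05, 0.10, 0.95]

# q=q makes q a default argument, evaluated at definition time,
# so each lambda is bound to that iteration's value
q_loss_f = [lambda y, f, q=q: tilted_loss(q, y, f) for q in qs]
print([g(None, None) for g in q_loss_f])  # [0.05, 0.1, 0.95]
print(q_loss_f[0].__name__)               # '<lambda>'
```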