I got this code from python.org:

class A:
    a = 42
    b = list(a + i for i in range(10))

Here `a = 42` is defined and in the loop `a + i` is computed, but when I look at the result returned in `b`, I see:

A.b  
[10, 11, 12, 13, 14, 15, 16, 17, 18, 19]  

I tried to understand it but couldn't. Can someone please explain how this code works? Not defining `a = 42` doesn't raise any error either, so how is `i` being added to `a` when `a` is not defined?
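
A minimal sketch of the behaviour in question (the global `a` here is an assumption, not part of the original snippet; a value of 10 would reproduce the output shown above):

a = 10  # hypothetical module-level a; the original value is unknown

class A:
    a = 42                               # class attribute; visible while the class body runs
    b = list(a + i for i in range(10))   # the generator body runs in its own function-like
                                         # scope, which skips the class scope, so this a
                                         # resolves to the *global* a

print(A.a)  # 42
print(A.b)  # [10, 11, 12, 13, 14, 15, 16, 17, 18, 19]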

  • Unless you *also* have a global variable `a = 0` somewhere, the code you posted will throw a `NameError` exception. See the duplicate as to why that is. – Martijn Pieters Jul 16 '15 at 13:18
  • Not defining a = 42 doesn't cause any error. Then how is i being added to a when a is not defined? – n.imp Jul 16 '15 at 13:24
  • That is what I am saying, you have a *global* `a` with value `0`. – Martijn Pieters Jul 16 '15 at 13:27
  • Oh yes, got it: a was defined globally. But again the question is that I defined a = 42, yet the next line gives an error that the global a is not defined, when the local a should be available there. – n.imp Jul 16 '15 at 13:38
  • I closed your post as a duplicate half an hour ago. :-) Go read the post I linked you to. – Martijn Pieters Jul 16 '15 at 13:45
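
A short sketch illustrating what the comments above describe, plus two common workarounds (the class names B and C are illustrative): a plain for loop executes directly in the class scope and therefore does see the class-level a, and a default argument captures it explicitly, because default values are evaluated in the class scope at definition time.

class B:
    a = 42
    # A plain for loop runs directly in the class scope,
    # so it sees the class-level a:
    b = []
    for i in range(10):
        b.append(a + i)
    del i  # avoid leaving the loop variable behind as a class attribute

class C:
    a = 42
    # Alternatively, capture the class-level a as a default argument;
    # the default value a=a is evaluated in the class scope:
    b = (lambda a=a: [a + i for i in range(10)])()

print(B.b)  # [42, 43, ..., 51]
print(C.b)  # [42, 43, ..., 51]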

0 Answers