I was wondering if someone could explain this behavior to me:
x = 10

def foo1():
    y = x + 1
    print(y)  # This works, because x is read from the global scope

def foo2():
    y = x + 1  # This line throws an exception even though the global x exists
    print(y)
    x = y
Calling foo2 throws an UnboundLocalError: local variable 'x' referenced before assignment. I understand that I could declare x as global at the beginning of foo2, but I'm curious where in the language this behavior is defined. I would expect the first line to read the global x and the second line to write the local x. Why is this not the case?
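For reference, here is a minimal sketch of the workaround I mentioned: adding a global declaration makes every occurrence of x in the function (both the read and the write) refer to the module-level name, so the error goes away.

```python
x = 10

def foo2():
    global x   # every use of x in this function now means the module-level x
    y = x + 1  # reads the global x (10)
    print(y)
    x = y      # writes the global x

foo2()         # prints 11
print(x)       # the global x has been updated to 11
```

This confirms that the error is not about the value being missing, but about which scope the name x is bound to for the whole function body.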