Consider the following code:
    def f():
        y = x + 1

    x = 1
    f()
This code runs fine. As I understand it, when the function is called, Python searches for the name x first in the local scope; since it doesn't find it there, it searches the global scope, reads the value of x, and execution continues from there. So far, so good.
But the following piece of code throws an `UnboundLocalError: local variable 'x' referenced before assignment`:

    def g():
        y = x + 1
        x = y

    x = 1
    g()
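For reference, here is a minimal way to reproduce and catch the failure (the exact wording of the error message varies between Python versions, so don't rely on it):

```python
def g():
    y = x + 1  # raises at call time: x is treated as local throughout g
    x = y

x = 1

# Defining g raised nothing; only calling it does.
try:
    g()
except UnboundLocalError as e:
    print(e)
```

Note that the global `x = 1` doesn't help: the assignment inside g makes x local for the whole function body, so the read on the first line never reaches the global.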
So my guess is that Python decided that x inside the definition of g is a local variable. But when does this decision happen? I guess it doesn't happen during code execution (when g is called in the last line), because then Python would read the first line inside the definition of g and interpret it as it does in f, as a reference to the global variable x. It would then read the second line in the definition of g and create a local variable x. That would mean that inside the definition of g, the same name x would mean global and local on different lines, which would be a nightmare!
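Indeed, the compiled function object already records which names are local. One way to see this (assuming CPython, where code objects expose a `co_varnames` attribute listing the compiler's local names) is:

```python
def f():
    y = x + 1      # x is only read, so it stays a global lookup

def g():
    y = x + 1
    x = y          # this assignment makes x local everywhere in g

# co_varnames lists the names the compiler classified as local
print(f.__code__.co_varnames)  # x is absent
print(g.__code__.co_varnames)  # x is present, without g ever being called
```

Neither function is called here, so the classification clearly happens before execution of the function body.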
I'm wondering if the decision to bind x to the local namespace is taken while the def statement is compiled. What confuses me, though, is that the error is thrown only when the function is called (so during execution), and not right after the def statement, when the function definition has been processed.
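One way to probe where the decision lives (assuming CPython; the `dis` module is in the standard library, though exact opcode names vary across versions) is to disassemble both functions and compare the instruction used to read x:

```python
import dis

def f():
    y = x + 1

def g():
    y = x + 1
    x = y

dis.dis(f)  # the read of x compiles to LOAD_GLOBAL
dis.dis(g)  # the read of x compiles to a LOAD_FAST-style opcode (local slot)
```

The bytecode is fixed at compile time; the LOAD_FAST-style instruction in g simply fails at call time because the local slot for x hasn't been filled yet, which would explain why the error only appears when g is called.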