I am a bit confused by how Python handles variable scope.
Below is a simple example that I created to explore variable scope. When I execute the code below, the `UnboundLocalError` surprisingly happens at the line `print(x)` instead of at `x = x + 1`.
```python
def foo():
    print(x)
    x = x + 1

x = 0
foo()
```
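For reference, this is the traceback I get (assuming the snippet is saved as `scope.py`; I believe the exact wording differs on Python 3.11+, which complains that it "cannot access local variable 'x'"):

```
Traceback (most recent call last):
  File "scope.py", line 6, in <module>
    foo()
  File "scope.py", line 2, in foo
    print(x)
UnboundLocalError: local variable 'x' referenced before assignment
```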
I would assume that at the line `print(x)`, Python should have access to the `x` in the global scope, no? (In fact, if we comment out the line `x = x + 1`, the code actually runs; see the variant below.)
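Just to spell out the variant I mean, this version runs and prints `0`:

```python
def foo():
    # No assignment to x anywhere in foo, so this lookup
    # falls through to the global scope.
    print(x)

x = 0
foo()  # prints 0
```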
How exactly does the Python interpreter work here?
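In case it is relevant, I also looked at the compiled bytecode with `dis` (I am only guessing that the `LOAD_FAST` on the `print(x)` line, rather than a `LOAD_GLOBAL`, is the telling detail):

```python
import dis

def foo():
    print(x)
    x = x + 1

# On my Python version, the load of x on the print line shows up
# as LOAD_FAST (a local-variable load), not LOAD_GLOBAL, which
# suggests the compiler decided x is local before foo ever ran.
dis.dis(foo)
```

If that is what is happening, why does an assignment later in the function change how an earlier line is compiled?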