Why does Python have the following behaviour?
    a = 4

    def f1():  # No error: returns 5; global a unchanged
        a = 5
        return a

    def f2():  # No error: returns the value of the global a
        return a

    def f3():  # UnboundLocalError: local variable 'a' referenced before assignment
        a = a + 1
        return a

    def f4():  # UnboundLocalError: local variable 'a' referenced before assignment
        b = a
        a = b + 1
        return a

    def f5():  # No error: returns the value of the global a
        b = a
        return b
It seems that Python sometimes permits the use of a global variable a
and sometimes doesn't. What rule governs this?
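For what it's worth, I noticed that adding a `global` declaration makes the failing case work, so the error seems tied to assignment inside the function. A minimal sketch (the name `f3_fixed` is just mine for illustration):

```python
a = 4

def f3_fixed():
    global a   # declare that assignments to a rebind the module-level name
    a = a + 1  # now reads the global a (4) and rebinds it to 5
    return a

result = f3_fixed()
print(result)  # prints 5
print(a)       # prints 5; the global was actually modified
```

So the behaviour appears to hinge on whether the function assigns to the name anywhere in its body, not on where the first use occurs.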