In the following:
a = 3

def b():
    # global a   # will error without this statement
    a = a + 1
    print(a)
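Calling b() as written (with the global line commented out) fails roughly like this, though the exact wording of the message depends on the Python version:

>>> b()
Traceback (most recent call last):
  ...
UnboundLocalError: local variable 'a' referenced before assignment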
It errors unless I add the global statement. It seems that Python evaluates the LHS of the assignment first (creating a new local a) rather than evaluating the RHS first (which would equal 4) and then assigning the result to the local a. In other words, something like:
local a <-- local a + 1
            ^
            doesn't exist, so look it up in the parent environment
local a <-- global a + 1
local a <-- 3 + 1
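To show what I mean, here is a minimal sketch (the function names c and d are just for illustration): a function that only reads a finds the global one, and the failure only appears once the function also assigns to a:

a = 3

def c():
    print(a)     # only reads a: the global a is found, prints 3

def d():
    a = a + 1    # a is assigned in this function, so the a on the
    print(a)     # right-hand side is the new (still unset) local a

c()              # prints 3
d()              # raises UnboundLocalError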
I'm just curious why that approach isn't used in Python as the default. For example, C uses this pattern:
// file.c
#include <stdio.h>

int a = 3;

int main(void)
{
    a = a + 1;
    printf("%d\n", a);
}
$ gcc file.c -o file; ./file
4
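For reference, the Python version only matches that C behaviour once I add the global statement from above (a sketch, with file.py just as an illustrative name):

# file.py
a = 3

def b():
    global a     # without this line the next assignment errors
    a = a + 1
    print(a)

b()

$ python file.py
4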