
In the following:

a = 3
def b():
    # global a  # will error without this statement
    a = a + 1
    print(a)

It errors unless I add `global a`. It seems in this sense that Python evaluates the LHS of the assignment first (creating a new local `a`) rather than the RHS first (which equals 4) and then assigning the result to the local `a`. In other words, like:

local a <-- local a + 1
             ^
            doesn't exist, so look it up in the parent environment
local a <-- global a + 1
local a <--    3     + 1

I'm just curious why that approach isn't used in Python as the default. For example, C uses this pattern:

// file.c
#include <stdio.h>

int a=3;
int main(void)
{
    a = a + 1;
    printf("%d\n", a);
}
$ gcc file.c -o file; ./file
4
carl.hiass
  • " It seems in this sense that python evaluates the LHS of the expression first (creating a new local for a) rather than the RHS first (which equals 4) " no, what happens is the *compiler* marks `a` as local, therefore when you reference `a` in the expression `a += 1` it will raise an error, because that local variable isn't defined. Defaulting to assigning to a global variable would be bad. Python doesn't have variable declarations like C, so it uses a simple rule to resolve this ambiguity. Note, a language with the opposite behavior is JS, and the default global is considered a huge problem. – juanpa.arrivillaga Mar 24 '21 at 21:01
  • Note, there is one case in Python where you can do something like `a = a + 1` where `a` assigns to the local scope and uses `a` from the global scope (if it exists), and that is in class definitions. So inside, say, a `class Foo: ...`, if you do `a = a + 1` it will assign to the `a` local to the class body, but it will use the `global a` for `a + 1` (if no other `a` has been assigned in the class block) – juanpa.arrivillaga Mar 24 '21 at 21:06
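The class-body behavior described in the comment above can be verified directly; a minimal sketch (`Foo` is just an illustrative name):

```python
a = 3

class Foo:
    # no prior 'a' in the class namespace, so the RHS reads the global a (3),
    # and the result is stored as a class attribute
    a = a + 1

print(Foo.a)  # 4
print(a)      # still 3
```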

1 Answer


There is no official explanation, but I can think of two reasons:

  • Since a = a + 1 is an assignment, it refers to the local variable a, not the global one (unless otherwise specified). Since you have not declared a local a, it is implicitly defined but not initialized (something similar happens in JavaScript too, and it is also a common source of confusion there). In C you would not have that misunderstanding: it's a static language, so you would have defined a local int a if one existed.
  • In Python you could have defined a function c() inside function b(), which would bind to the a variable inside b, not the global a. C doesn't have closures, so this consideration doesn't arise there.
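The closure point in the second bullet can be sketched as follows (the names and the counter behavior are just illustrative, assuming Python 3's `nonlocal`):

```python
def b():
    a = 0              # local to b

    def c():
        nonlocal a     # binds to b's a, not to any global a
        a += 1
        return a

    return c

counter = b()
print(counter())  # 1
print(counter())  # 2  -- c keeps referring to b's a across calls
```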
blue_note
  • that's a great point about C being a static language. I think that's probably why `a=a+1` may be ambiguous in python. – carl.hiass Mar 24 '21 at 20:16
  • @carl.hiass: probably, because it would have to both be read globally (on the right) and written locally (on the left). Since the same name can't bind to two different variables, there is ambiguity indeed. – blue_note Mar 24 '21 at 20:19
  • "Since you have not declared a local `a`"... *Python doesn't have variable declarations*. – juanpa.arrivillaga Mar 24 '21 at 21:02
  • Yes, dynamic languages don't commonly use declarations in the C sense, but the name binding still occurs on the first assignment. That has not happened here; the first assignment both reads and writes the variable, hence the ambiguity – blue_note Mar 24 '21 at 21:11
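The compile-time nature of this rule, mentioned in the comments, can be checked with the standard `dis` module; a minimal sketch:

```python
import dis

a = 3

def b():
    a = a + 1  # the assignment anywhere in b makes 'a' local to all of b
    print(a)

# The bytecode uses LOAD_FAST/STORE_FAST for 'a' (local variable opcodes),
# never LOAD_GLOBAL -- the decision was made at compile time.
dis.dis(b)

print('a' in b.__code__.co_varnames)  # True: 'a' is a local of b
```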