
The first program works when the variable var is defined outside the function f: the value 232 is printed successfully. In the second program, var is again defined outside the function f, but I get an error saying that var is referenced before assignment when the print statement in the function runs. Why does the first program work but the second one doesn't?

First program:

var = 232
def f():
    print(var)

f()

The second program:

var = 232
def f():
    print(var)
    var += 1

f()

The output of the first program is as expected: it prints 232. The second program should also print 232 and then increment var, but instead an UnboundLocalError is raised at the print statement.
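For reference, a minimal sketch reproducing the failure (the exception class and the try/except wrapper are my additions, not part of the original question): because the function body contains an assignment to var, Python treats var as local for the entire function, so the earlier print raises UnboundLocalError rather than reading the global.

```python
var = 232

def f():
    try:
        print(var)          # raises UnboundLocalError, not NameError
    except UnboundLocalError as e:
        print(type(e).__name__)
    var = 0                 # this assignment makes var local throughout f

f()
```

The global var is untouched afterwards: the var assigned inside f is a separate local name.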

1 Answer


If you want to modify a global variable inside a function, you need to tell the function that it is dealing with a global variable:

var = 232
def f():
    global var
    print(var)
    var += 1
    print(var)

f()

OUTPUT

232
233
Hoog
  • So if you're accessing or just printing a variable within a function, you don't need to define it as global (like the first program) but if you're manipulating it (like the second program), you do need to define it as global? – Danny Berlin Apr 26 '19 at 15:15
  • That's it! Values outside of your function are safe from being manipulated in your function unless they are passed in or you declare a global variable to make your function look outside itself. – Hoog Apr 26 '19 at 16:07
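The rule from the comments can be sketched side by side (function names here are my own illustrations, not from the original thread): a function with no assignment to var reads the global freely, while any assignment, even `var = var + 1`, makes var local for the whole body, so the read on the right-hand side fails unless you declare `global var`.

```python
var = 232

def read_only():
    return var          # no assignment to var here, so the global is read

def try_to_write():
    var = var + 1       # assignment makes var local; the right-hand read fails
    return var

def write_global():
    global var          # explicitly opt in to rebinding the global
    var += 1
    return var

print(read_only())      # 232
try:
    try_to_write()
except UnboundLocalError:
    print("try_to_write failed")
print(write_global())   # 233
```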