This doesn't have to do with PyTorch specifically. Python assumes any assignment within a local scope refers to a local variable unless the variable is explicitly declared global
in that scope. A similar question: Why does this UnboundLocalError occur (closure)?
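The closure variant fails the same way; the fix there is nonlocal rather than global. A minimal sketch (names here are illustrative, not from the question):

```python
def outer():
    x = 1
    def inner():
        # The assignment below makes x local to inner, so the read
        # implied by += happens before any local binding exists.
        x += 1
    inner()

try:
    outer()
except UnboundLocalError as e:
    print(type(e).__name__)  # UnboundLocalError
```

Declaring nonlocal x inside inner would make the increment apply to outer's x instead.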
For your particular question, the problem is that x is defined only in the global scope, so you can't assign a new value to x without declaring it global. On the other hand, x.data refers to an attribute of x; the attribute itself is not a global name, so you can assign to it without using the global keyword.
As an example, consider the following code:

class Foo():
    def __init__(self):
        self.data = 1

x = Foo()

def f():
    x.data += 1

f()
print(x.data)  # 2
This code updates x.data as expected, since the assignment targets an attribute of x rather than rebinding the name x itself.
On the other hand,

class Foo():
    def __init__(self):
        self.data = 1

    def __iadd__(self, v):
        self.data += v
        return self

x = Foo()

def f():
    x += 1  # UnboundLocalError

f()
print(x.data)
will raise an UnboundLocalError, because x += 1 is interpreted by the Python compiler as an assignment to x, and therefore x must refer to a local variable. Since no local x has been assigned before that point, you get an exception.
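Note that this decision is made at compile time for the whole function body, not line by line, so even a read that appears before the assignment fails. A small sketch to illustrate (names are hypothetical):

```python
x = 10

def f():
    # x is local throughout f because of the assignment below,
    # so this read fails even though a global x exists.
    print(x)
    x = 1

try:
    f()
except UnboundLocalError:
    print("x is local to f")
```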
In order for the previous code to work, we need to explicitly declare x to be global within the function's scope.
class Foo():
    def __init__(self):
        self.data = 1

    def __iadd__(self, v):
        self.data += v
        return self

x = Foo()

def f():
    global x  # tell Python that x refers to the global variable
    x += 1

f()
print(x.data)  # 2
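If you'd rather avoid global, a common alternative is to pass the object in and rebind a local name; a sketch using the same Foo class (the parameterized f here is illustrative):

```python
class Foo():
    def __init__(self):
        self.data = 1

    def __iadd__(self, v):
        self.data += v
        return self

def f(obj):
    obj += 1  # rebinds the local name obj; __iadd__ returns self
    return obj

x = Foo()
x = f(x)   # reassign at the call site instead of inside f
print(x.data)  # 2
```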