Python relies on the __class__ variable being available in a closure cell for a zero-argument super() call. It gets this cell from the free variables of the first stack frame. The odd thing, though, is that this variable isn't normally in locals(), yet it shows up there as soon as you merely reference it from the __init__ method.
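To illustrate what I mean (a minimal sketch; the class names are just for demonstration): referencing __class__ or calling zero-argument super() makes the compiler treat __class__ as a free variable of the method, and the class object ends up in a closure cell.

class Base:
    def greet(self):
        return "base"

class Child(Base):
    def greet(self):
        # zero-argument super() only works because __class__ is a
        # free variable of this method, stored in a closure cell
        return super().greet() + " via " + __class__.__name__

print(Child().greet())                           # base via Child
print(Child.greet.__code__.co_freevars)          # ('__class__',)
print(Child.greet.__closure__[0].cell_contents)  # <class '__main__.Child'>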
Take for example this bit of code:
class LogicGate:
    def __init__(self, n):
        print(locals())
        a = __class__
        print(locals())
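A disassembly of the __init__ method shows how each name gets loaded; a minimal way to produce it with the standard dis module (my exact script may have differed slightly):

import dis

class LogicGate:
    def __init__(self, n):
        print(locals())
        a = __class__
        print(locals())

# disassemble the method to see which opcode loads each name
dis.dis(LogicGate.__init__)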
When you disassemble this, you can see that it somehow knows print and locals are globals while __class__ is a LOAD_DEREF. How does the compiler know this before running the code? As far as I know, locals, print and __class__ are just variable names to the compiler. Also, this way __class__ suddenly shows up in locals() even before it's copied into a.
  4          10 LOAD_DEREF               0 (__class__)

while locals:

              2 LOAD_GLOBAL              1 (locals)
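Interestingly, this classification already exists before any bytecode runs: CPython's symbol-table pass over the AST assigns a scope to every name, and the standard-library symtable module exposes that result. A rough sketch (my own script; the variable names are just for illustration):

import symtable

src = """
class LogicGate:
    def __init__(self, n):
        print(locals())
        a = __class__
        print(locals())
"""

mod = symtable.symtable(src, "<example>", "exec")
logic_gate = mod.get_children()[0]   # table for class LogicGate
init = logic_gate.get_children()[0]  # table for __init__

# __class__ is reported as free, print/locals as global, self/a as local
for name in ("print", "locals", "__class__", "self", "a"):
    sym = init.lookup(name)
    kind = ("free" if sym.is_free()
            else "global" if sym.is_global()
            else "local" if sym.is_local()
            else "?")
    print(name, kind)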
I'm asking because I'm working on Skulpt, a Python-to-JavaScript compiler, and currently that compiler doesn't differentiate between print and __class__: it attempts to look both up in the global scope.
As you can see from a printout of the AST for the above bit of code, the parser doesn't differentiate between locals and __class__ either:
Module(body=[ClassDef(name='LogicGate',
bases=[],
keywords=[],
body=[FunctionDef(name='__init__',
args=arguments(args=[arg(arg='self',
annotation=None),
arg(arg='n',
annotation=None)],
vararg=None,
kwonlyargs=[],
kw_defaults=[],
kwarg=None,
defaults=[]),
body=[Expr(value=Call(func=Name(id='print',
ctx=Load()),
# here's the load for locals
args=[Call(func=Name(id='locals',
ctx=Load()),
args=[],
keywords=[])],
keywords=[])),
Assign(targets=[Name(id='a',
ctx=Store())],
# here's the load for __class__
value=Name(id='__class__',
ctx=Load())),
Expr(value=Call(func=Name(id='print',
ctx=Load()),
args=[Call(func=Name(id='locals',
ctx=Load()),
args=[],
keywords=[])],
keywords=[]))],
decorator_list=[],
returns=None)],
decorator_list=[])])
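For reference, a dump carrying the same information can be produced with the standard ast module (the indent argument assumes Python 3.9+; the layout will differ slightly from the one above):

import ast

src = """
class LogicGate:
    def __init__(self, n):
        print(locals())
        a = __class__
        print(locals())
"""

# every name is just Name(id=..., ctx=Load()/Store());
# nothing in the AST itself marks __class__ as special
print(ast.dump(ast.parse(src), indent=4))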