
Let's say I write the file testFunc.py, containing:

def helloWorld():
    print['hello world']

When I run the following inside the Python interpreter (v. 3.5.2)

>>> import testFunc
>>> testFunc.helloWorld()

as expected, I get the error message

... error ... line 2, in helloWorld
print['hello world']

If I correct the code to

def helloWorld():
    print('hello world')

and rerun only the function call, without closing Python in the meantime, I get the following error message:

>>> testFunc.helloWorld()

... error ... line 2, in helloWorld
print('hello world')

So I still get an error, even though the code shown inside the error message is correct. Apparently Python stored the compiled bytecode of the function somewhere and didn't recompile it. On the other hand, Python reads the source lines for the traceback from the new file, which no longer contains the error.
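As far as I can tell, the traceback machinery fetches source lines through the standard linecache module, which re-reads the file from disk when it has changed, while the function object itself keeps its old compiled code. A minimal sketch reproducing the display side (assuming testFunc.py is in the current directory):

import linecache

# traceback calls linecache.checkcache() before fetching source lines;
# checkcache() compares the file's mtime and re-reads testFunc.py if it
# changed on disk, which is why the traceback shows the *edited* line
# even though the old bytecode is still the one being executed.
linecache.checkcache('testFunc.py')
print(linecache.getline('testFunc.py', 2))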

So why does Python do this? I find this behavior highly misleading. Why does Python not compare the time stamps of the source file and the cached bytecode, and recompile the source if it has been edited since the import? Is it because another instance of Python may currently be executing the cached bytecode?
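From what I understand so far, the cached module object lives in sys.modules, and importlib.reload() forces the source to be re-executed. A minimal sketch of picking up the edit without restarting the interpreter:

import importlib
import sys

import testFunc

# A repeated "import testFunc" is a no-op: the module object created by
# the first import is cached in sys.modules and simply returned again.
print('testFunc' in sys.modules)   # True

# After editing testFunc.py on disk, reload() re-executes the new
# source in the existing module object, replacing the old function.
importlib.reload(testFunc)
testFunc.helloWorld()              # now runs the corrected code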

    Because module imports are cached. Modules are loaded just **once**, because otherwise your Python programs would be slow and Python modules could not store any globals that they wanted to reuse. – Martijn Pieters Apr 16 '18 at 07:24
  • Related: https://bugs.python.org/issue8087 – jonrsharpe Apr 16 '18 at 07:25
  • Can you imagine the performance overhead if python had to check date/timestamps every time a module was accessed? – cdarke Apr 16 '18 at 07:25
  • Thanks for the fast reply. This sounds reasonable. But wouldn't there be a way to update the code shown in the error message? Or at least state that the code may have changed? Because if I get back an error message, speed isn't that much of an issue. – marco Apr 16 '18 at 07:37
