In Question 2918898, users discussed how to avoid caching because modules were changing, and solutions focused on reloading. My question is somewhat different; I want to avoid caching in the first place.
My application runs on Un*x and lives in `/usr/local`. It imports a module with some shared code used by this application and another. It's normally run as an ordinary user, and Python doesn't cache the module in that case, because it doesn't have write permission for that system directory. All good so far.
However, I sometimes need to run the application as superuser, and then it does have write permission and it does cache it, leaving unsightly footprints in a system directory. Do not want.
So ... any way to tell CPython 3.2 (or later, I'm willing to upgrade) not to cache the module? Or some other way to solve the problem? Changing the directory permissions doesn't work; root can still write, root is all-powerful.
I looked through PEP 3147 but didn't see a way to prevent caching.
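One in-process knob that does exist (it predates PEP 3147, and the PEP's caching respects it) is `sys.dont_write_bytecode`, the runtime equivalent of the `-B` flag. A minimal sketch, using a throwaway module name invented for the demonstration:

```python
import importlib
import os
import sys
import tempfile

# Disable .pyc writing for everything imported from here on; this is the
# in-process equivalent of the -B flag or the PYTHONDONTWRITEBYTECODE
# environment variable.
sys.dont_write_bytecode = True

# Demonstrate with a throwaway module ("scratchmod" is made up for this
# sketch): import it, then check that no __pycache__ directory appeared.
with tempfile.TemporaryDirectory() as tmp:
    with open(os.path.join(tmp, "scratchmod.py"), "w") as f:
        f.write("VALUE = 42\n")
    sys.path.insert(0, tmp)
    mod = importlib.import_module("scratchmod")
    sys.path.remove(tmp)
    print(mod.VALUE)                                        # 42
    print(os.path.isdir(os.path.join(tmp, "__pycache__")))  # False
```

The catch for the sudo case is that the assignment has to happen inside the program itself, before the shared module is imported, rather than in the wrapper script.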
I don't recall any way to import code other than `import`. I suppose I could read a simple text file and `exec` it, but that seems inelegant and bug-prone.
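For completeness, the `exec` route dismissed above would look roughly like this; the shared-code file is simulated with a temp file here. No import machinery runs, so nothing is ever compiled to disk, but tracebacks, tooling, and module identity all suffer:

```python
import tempfile

# Stand-in for the shared module: write some code to a plain text file.
with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
    f.write("def greet():\n    return 'hello'\n")
    path = f.name

# Read the source and execute it into a fresh namespace. The code is
# compiled in memory only; no .pyc is written anywhere.
namespace = {}
with open(path) as f:
    exec(f.read(), namespace)

print(namespace["greet"]())  # hello
```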
The run-as-root is accomplished by calling the program with `sudo` in a shell script, and I can have the shell script delete the cache after the run, but I'm hoping for something more elegant that doesn't change the directory's last-modified timestamp.
Implemented solution, based on Wander Nauta's answer:
Since I run the executable as a plain filename, not as `python executablename`, I went with the environment variable. First, the `sudoers` file needs to be changed to allow setting environment variables:

```
tom ALL=(ALL) SETENV: NOPASSWD: /usr/local/bkup/bin/mkbkup
```
Then, the invocation needs to include the variable:

```
/usr/bin/sudo PYTHONDONTWRITEBYTECODE=true /usr/local/bkup/bin/mkbkup "$@"
```
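One detail worth noting: CPython only checks whether `PYTHONDONTWRITEBYTECODE` is set to a non-empty string, so the value `true` is not special. Inside the interpreter the setting shows up as `sys.dont_write_bytecode`, which a quick sketch can confirm by spawning a child interpreter:

```python
import os
import subprocess
import sys

# Launch a child interpreter with the variable set (any non-empty value
# works) and report sys.dont_write_bytecode from inside it.
env = dict(os.environ, PYTHONDONTWRITEBYTECODE="1")
out = subprocess.run(
    [sys.executable, "-c", "import sys; print(sys.dont_write_bytecode)"],
    env=env, capture_output=True, text=True,
)
print(out.stdout.strip())  # True
```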