I have a repo with Python code whose structure could be boiled down to this:
```
repo_root\
    tool1\
        tool1.py
        tool1_aux_stuff.py
    tool2\
        tool2.py
        tool2_aux_stuff.py
    lib\
        lib1\
            lib1.py
            lib1_aux_stuff.py
        lib2\
            lib2.py
            lib2_aux_stuff.py
```
The following rules apply to the module usage:
- Any tool may use modules from any library and from its own package, but not from another tool's package.
- Any library may use modules from any other library and from its own package. Libraries never import tool modules.
- There must be a way to invoke any tool from any working directory, including those outside `repo_root`.
The question is: how do I import the lib modules from the tool ones?
I know that if I add an `__init__.py` to each tool and lib directory and to the repo root, then I would be able to use absolute imports from the root, e.g. in `tool1.py` I could write

```python
import lib.lib1, lib.lib2.lib2_aux_stuff
```
However, if I execute `tool1.py` from a random place, e.g.

```
machine_name: ~/random/place$ python /path/to/repo/tool1/tool1.py
```

I get a `ModuleNotFoundError: No module named 'lib'` error.
I am aware of a workaround: augment the `PYTHONPATH` environment variable with the absolute path to `repo_root` and supply it when invoking the tool script, i.e.:

```
machine_name: ~/random/place$ PYTHONPATH=$PYTHONPATH:/path/to/repo python /path/to/repo/tool1/tool1.py
```

but I would really prefer something less clunky.
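Another workaround I'm aware of is to have each tool script prepend the repo root to `sys.path` before importing anything from `lib` (a sketch; it assumes the tool script sits exactly one directory below `repo_root`):

```python
# tool1/tool1.py
import sys
from pathlib import Path

# The repo root is the parent of this tool's directory; prepending it to
# sys.path makes `lib` importable regardless of the working directory.
repo_root = Path(__file__).resolve().parent.parent
sys.path.insert(0, str(repo_root))

# import lib.lib1  # now resolvable from any CWD
```

but this boilerplate would have to be replicated at the top of every tool script, which feels just as clunky.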
Any ideas on how I could do this in a more straightforward way?