I have read a lot of 'how-to' articles on Python imports (and the related SO questions), but I'm still struggling to figure out the best practice for managing imports in a large Python project. For example, say I have a project structure like the one below (this is an oversimplification):
test/
    packA/
        subA/
            __init__.py
            sa1.py
            sa2.py
        __init__.py
        a1.py
        a2.py
    packB/
        b1.py
        b2.py
    main.py
And say that inside packA/subA/sa1.py I want to import code from packB/b1.py. More generally, I want to be able to freely import between packages/subpackages inside the project.
Based on my current understanding, there are four ways to go about doing this:
Option 1
Add my project root to PYTHONPATH permanently and use absolute imports everywhere in the project. So inside packA/subA/sa1.py I would write:

    from packB import b1
This could get a bit messy as the project tree gets deeper, e.g.

    from packB.subC.subD.subE import f1
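For concreteness, a minimal sketch of what sa1.py could look like under this option, assuming the project root test/ has been added to PYTHONPATH (the helper function name is made up):

    # packA/subA/sa1.py
    # Assumes the project root (test/) is on PYTHONPATH, e.g. exported once in the
    # shell profile, so every import is written as an absolute path from the root.
    from packB import b1

    def use_b1():
        # b1.helper is a hypothetical function, only here to show the call style
        return b1.helper()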
Option 2
Same as above, but instead of modifying PYTHONPATH to include the project root, I simply make a point of always running python from the project root (so that the root is always the working directory).
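To make this concrete, a sketch of one way it plays out, assuming modules are run with the -m switch: with -m, the current directory (here, the project root) is added to the front of sys.path, so the same absolute imports as in Option 1 resolve without touching PYTHONPATH.

    # packA/subA/sa1.py, run from inside test/ as:
    #     python -m packA.subA.sa1
    # The -m switch adds the current directory (the project root) to sys.path,
    # so the absolute import below resolves exactly as in Option 1.
    from packB import b1

    if __name__ == "__main__":
        print("imported", b1.__name__)   # prints "imported packB.b1"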
Option 3
Use relative imports
from ...packB import b1
I don't like this because it's not easy to read, and most of what I've read says relative imports are a bad idea.
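To make the limitation concrete, here is a sketch of what relative imports can and cannot reach from sa1.py, given only the tree above (these lines work when sa1 is imported as part of packA, not when sa1.py is run directly as a script):

    # packA/subA/sa1.py
    from . import sa2           # sibling module inside subA
    from .. import a1           # module one level up, inside packA
    # from ...packB import b1   # fails at import time: relative imports cannot
    #                           # climb above the top-level package (packA here)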
Option 4
Use a setuptools setup.py script and install my packages with pip so that I can import them from anywhere.
This seems like overkill, since all of the code already lives in the project folder (and I would have to re-install each time a package changes), and it could also cause headaches with dependency/version management.
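For reference, the setup.py for this layout could be as small as the sketch below ("myproject" is a placeholder name). An editable install (pip install -e . from the project root) at least avoids re-installing after every code change:

    # setup.py at the project root (test/)
    # A minimal sketch; install with `pip install -e .` for an editable install.
    from setuptools import setup, find_packages

    setup(
        name="myproject",          # placeholder distribution name
        version="0.1.0",
        packages=find_packages(),  # finds packA and packA.subA via their __init__.py;
                                   # packB has no __init__.py in the tree above, so it
                                   # would need one (or a namespace-aware finder) to be picked up
    )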
So my question is: which of the above (if any) is considered best practice? I'm undecided between 1 and 2 at the moment, but it would be great to hear about a more elegant/Pythonic approach.
Note: I am using Python 3.6