I've reviewed the information here: "How can I make setuptools (or distribute) install a package from the local file system"
That question was posted a very long time ago, and I'm hoping that in the seven years since there have been some new developments in managing dependencies in the Python world.
Specifically, in our case we are working on a repository of related GCP packages:
    src/
        airflow/
            __init__.py
            dags/
                __init__.py
                requirements.txt
                dag1.py
            libs/
                __init__.py
                utils.py
            tests/
                dags/
                    test_dag1.py
            plugins/
        dataflow/
            __init__.py
            setup.py
            dataflow1.py
            libs/
                __init__.py
                utils.py
        cli_helper/
            __init__.py
            cli_command.py
            libs/
                __init__.py
                util.py
        shared_utils/
            util1.py
I have found myself repeating the same helper functions within each package, and I would like to put those helper functions in one place and then have a linked copy of the shared_utils files, either in a shared_utils folder under each package, or even just a copy of util1.py placed under the existing libs directory of each package.
What is the most "pythonic" way to accomplish this?
Right now it seems that my only options would be to:
- Use requirements.txt as listed above where I can, and use a custom command in setup.py where requirements.txt can't be used (see the first sketch after this list).
- Create an OS-level link (e.g., a symlink) from the shared_utils directory into each package, so that the directory appears to exist natively in each of my packages (second sketch below).
- Package my shared_utils and then install it directly from git. This option again requires requirements.txt, though, and in some of my deployment environments I can't rely on requirements.txt; I have to run everything through setup.py (third sketch below).
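
For the first option, the custom command I have in mind would hook into setup.py's build step to vendor the shared code where requirements.txt isn't available. A minimal sketch, assuming the layout above; the package name "dataflow" and all paths are placeholders taken from the tree, not a finished implementation:

    # setup.py for the dataflow package -- a minimal sketch of vendoring
    # shared_utils at build time. Paths and the package name are assumptions
    # based on the tree above.
    import shutil
    from pathlib import Path

    from setuptools import find_packages, setup
    from setuptools.command.build_py import build_py


    class BuildWithSharedUtils(build_py):
        """Copy util1.py from the repo-level shared_utils/ into this
        package's libs/ directory before the normal build runs."""

        def run(self):
            here = Path(__file__).resolve().parent           # src/dataflow
            src = here.parent / "shared_utils" / "util1.py"  # src/shared_utils/util1.py
            shutil.copyfile(src, here / "libs" / "util1.py")
            super().run()


    setup(
        name="dataflow",
        version="0.1.0",
        packages=find_packages(),
        cmdclass={"build_py": BuildWithSharedUtils},
    )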
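For the second option, the links could be created once by hand with ln -s, or with a small one-off script like the following. The script name link_shared.py is hypothetical, it assumes the repo root as the working directory, and it uses relative links so the repository stays relocatable:

    # link_shared.py -- hypothetical one-off script that symlinks the
    # top-level shared_utils/ into each package's libs/ directory.
    import os
    from pathlib import Path

    SRC = Path("src")
    PACKAGES = ["airflow", "dataflow", "cli_helper"]  # taken from the tree above

    for pkg in PACKAGES:
        link = SRC / pkg / "libs" / "shared_utils"
        if not link.exists():
            # Relative target, so the link survives moving or cloning the repo.
            os.symlink(os.path.relpath(SRC / "shared_utils", link.parent), link)

One caveat: git does preserve symlinks on Linux/macOS checkouts, but on Windows they may come out as plain files unless symlink support is enabled.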
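For the third option, one development since the linked question is PEP 508 direct references, which newer versions of pip (18.1+) accept inside install_requires, so the git dependency wouldn't have to live in requirements.txt at all. A sketch, assuming shared_utils were given its own setup.py so it is pip-installable; the repository URL is a placeholder:

    # setup.py fragment -- a sketch of declaring the git dependency directly,
    # so environments that only run setup.py still pick it up. The repo URL
    # is a placeholder, and this assumes shared_utils has its own setup.py.
    from setuptools import find_packages, setup

    setup(
        name="dataflow",
        version="0.1.0",
        packages=find_packages(),
        install_requires=[
            "shared_utils @ git+https://github.com/your-org/your-repo.git"
            "#subdirectory=src/shared_utils",
        ],
    )

Note that PyPI rejects distributions whose dependencies use direct URL references, so this would work for internal installs from source but not for anything published publicly.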