I have a Python package whose source tree looks like this:
├── MANIFEST.in
├── README.rst
├── setup.cfg
├── setup.py
└── sqlemon
    ├── connection_strings.py
    └── __init__.py
Most of the code is in __init__.py, which has the following imports:
import os
import sqlemon.connection_strings as sqlcs
import yaml  # This is the problem
If we run
python setup.py sdist
we see the following error:
Traceback (most recent call last):
  File "setup.py", line 2, in <module>
    import sqlemon
  File "/home/danielsank/src/sqlemon/sqlemon/__init__.py", line 4, in <module>
    import yaml
ImportError: No module named yaml
This suggests that the virtualenv in which I work on my project must have all of the project's dependencies installed in order to do development. I guess that's not unreasonable, but I'm not entirely sure what the workflow should look like, because the project's dependencies are listed in setup.py:
from distutils.core import setup

import sqlemon

version = sqlemon.__version__
project_name = sqlemon.__project_name__

setup(name=project_name,
      # Irrelevant lines removed
      install_requires=[
          'sqlalchemy',
          'alembic',
          'pyyaml',
          'sqlalchemy-schemadisplay'
      ],
)
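In practice that seems to mean the development workflow has a manual step: install every dependency into the virtualenv by hand before running any setup.py command. Roughly (a sketch, using the packages from the install_requires list above):

pip install sqlalchemy alembic pyyaml sqlalchemy-schemadisplay
python setup.py sdist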
I usually put requirements in requirements.txt so the developer can do pip install -r requirements.txt, but since the requirements are already in setup.py that seems redundant.
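For concreteness, that requirements.txt would just mirror the install_requires list (a sketch):

sqlalchemy
alembic
pyyaml
sqlalchemy-schemadisplay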
Furthermore, after uploading my project to PyPI, when I try to pip install it, the installation fails unless I already have pyyaml installed in my virtualenv. Obviously this is not the behavior we want; pyyaml should install automatically, as it is listed in the install_requires list in setup.py.
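In other words, in a fresh virtualenv the behavior I observe is roughly this (a sketch; "sqlemon" stands for whatever name the project has on PyPI):

pip install sqlemon   # currently fails because pyyaml is not yet installed
pip install pyyaml
pip install sqlemon   # now succeeds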
What is the recommended workflow for this situation?