Some time ago it was possible to define a single `requirements.txt` or similar containing both specs of PyPI packages and links to repositories and archives. That required parsing the `requirements.txt` and splitting its entries into "requirements" and "dependencies", where "requirements" would hold the PyPI package specs and "dependencies" the links.
Setuptools has separate `setup()` arguments for these: `install_requires` and `dependency_links`.
And it really worked: one could define a `requirements.txt` and install a package both via `python setup.py install` and via `pip install .`. Moreover, it was possible to install just the dependencies via `pip install -r requirements.txt`. All of these worked and allowed a single place to define all requirements, including non-PyPI links.
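Such a mixed file might have looked roughly like this (a hypothetical example; the tarball link is the one used later in this answer):

```
# requirements.txt: PyPI specs and links side by side (old-style)
numpy==1.16.0
https://github.com/keras-team/keras-contrib/tarball/master#egg=keras_contrib-2.0.8
```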
However, `pip` dropped support for the `dependency_links` argument in v19. And here's the weird part: `setuptools` has not dropped it. But there's more.
As of today, `pip`:

- Supports only `install_requires`.
- Prefers PEP 508 notation for dependencies both in package definitions (`install_requires`) and in standalone `requirements.txt` or similar files.
- Aborts the installation of packages that contain links in their `install_requires`.
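For instance, the PEP-508-style direct reference that modern `pip` expects looks roughly like this (a sketch; the URL is the tarball from this answer, the rest of `setup()` is elided):

```python
setup(
    ...
    install_requires=[
        "keras_contrib @ https://github.com/keras-team/keras-contrib/tarball/master",
    ],
)
```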
Your dependency definitions mix two notations: prefixes like `keras-contrib@` come from PEP 508, while the `#egg=` parts come from the `setuptools` links notation. This is not an issue: `pip` will ignore the "eggs", as the names are already defined before the `@`.
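A rough sketch of why that works (this is not pip's actual parser, just an illustration of the idea that in PEP 508 notation the distribution name already sits before the `@`):

```python
# Illustration only -- pip's real parsing is more involved.
spec = "keras-contrib @ git+https://github.com/keras-team/keras-contrib.git#egg=keras_contrib"

# The name is everything before the first "@"; the "#egg=" fragment
# in the link part carries no extra information and can be ignored.
name, _, link = spec.partition("@")
name, link = name.strip(), link.strip()

print(name)  # keras-contrib
```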
I believe installing the package via `pip` works fine, i.e.:

```
pip install .
```

However, issues will arise if the package is installed via `setuptools`, i.e.:

```
python setup.py install
```
`setuptools` does not understand PEP 508 notation and ignores links in `install_requires`. As of today, to make `setuptools` follow links, both `install_requires` and `dependency_links` have to be used, e.g.:
```python
setup(
    ...
    install_requires=[
        ...
        "keras_contrib==2.0.8",
        ...
    ],
    dependency_links=[
        "https://github.com/keras-team/keras-contrib/tarball/master#egg=keras_contrib-2.0.8",
        ...
    ],
)
```
Here are several tricky points:

- A single dependency is defined in two places: a package name in `install_requires` and a link in `dependency_links` to resolve that package.
- The link is not `git+https://.../....git`, but a link to an archive: `https://.../tarball/...`.
- The egg name is in `snake_case`, not in `dash-case`. While it's possible to use `dash-case`, that makes it impossible to specify the version.
- The version in `install_requires` is delimited via `==`, but in `dependency_links` via `-`.
- It's possible to omit the version, but the only viable use case for that is a package that is not present on PyPI and is rarely updated. If the package is present on PyPI but an unpublished version is needed, the version must be specified.
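Put together, the same dependency ends up spelled out twice with two different version delimiters. A small sketch of the two forms side by side:

```python
# One dependency, two notations: "==" in install_requires,
# "-" in the #egg= fragment of dependency_links.
name, version = "keras_contrib", "2.0.8"

requirement = f"{name}=={version}"  # goes into install_requires
link = (
    "https://github.com/keras-team/keras-contrib/tarball/master"
    f"#egg={name}-{version}"        # goes into dependency_links
)

print(requirement)  # keras_contrib==2.0.8
```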
And here's the bummer: fixing the links for `setuptools` will break `pip`, as PEP 508 does not allow specifying versions this way. Keeping `keras-contrib==x.y.z @ ...` in `install_requires` will make `pip` search for a package named `keras-contrib==x.y.z`, where `==x.y.z` is not a version but part of the name. At the same time, not specifying a version will make `setuptools` grab the latest version available on PyPI, not the one at the link from `dependency_links`.
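To illustrate the conflict (hypothetical spec strings; the version is the one from the example above):

```
keras-contrib==2.0.8 @ https://github.com/keras-team/keras-contrib/tarball/master
# pip treats "keras-contrib==2.0.8" as the project name, which matches nothing

keras-contrib @ https://github.com/keras-team/keras-contrib/tarball/master
# valid PEP 508 direct reference, but no way to pin the version
```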
In your case neither `keras-contrib` nor `en-core-web-sm` is present on PyPI, so using `keras_contrib@git+https://...` plus `dependency_links` without a version specified might work. Otherwise, stick to `pip install .` and avoid `python setup.py install` if the package depends on links.
Trivia: several related issues on GitHub are still open, and PEP 508 has been in Active state since 2015. Digging around the source code reveals that `setuptools` is a wrapper around Python's `distutils`. `setuptools` is not part of Python's stdlib, but the `distutils` docs imply the stdlib docs will be removed once the `setuptools` docs are updated. At the same time, `pip` is already bundled with Python installations as a module. And yet we have `Pipfile`s, `pipenv`, `poetry`, `conda`, `pipx`, `pip-tools`, `shiv`, `spack`, and the rest. Looks a bit overwhelming.