
According to my research, the following should work:

from setuptools import setup
from setuptools import find_packages
...
REQUIRES_INSTALL = [
    'spacy==2.3.2',
    'tensorflow==1.14.0',
    'Keras==2.2.4',
    'keras-contrib@git+https://github.com/keras-team/keras-contrib.git#egg=keras-contrib',
    'en-core-web-sm@https://github.com/explosion/spacy-models/releases/download/en_core_web_sm-2.3.0/en_core_web_sm-2.3.0.tar.gz#egg=en-core-web-sm'
]
...
setup(
    name=NAME,
    version=VERSION,
    description=DESCRIPTION,
    install_requires=REQUIRES_INSTALL,
    ...
)

When building a wheel or egg, everything is fine: python setup.py bdist_wheel.

But when I try to install the package (whl or egg) with pip install -U dist/mypack-....whl, I get:

ERROR: Could not find a version that satisfies the requirement keras-contrib (from mypack==0.3.5) (from versions: none)
ERROR: No matching distribution found for keras-contrib (from mypack==0.3.5)
...
ERROR: Could not find a version that satisfies the requirement en-core-web-sm (from mypack==0.3.5) (from versions: none)
ERROR: No matching distribution found for en-core-web-sm (from mypack==0.3.5)

I have tried the same via setup.cfg, but still no luck.


As a reference: all of these dependencies work when installing them first from requirements.txt and then installing the wheel.

spacy==2.3.2
tensorflow==1.14.0
Keras==2.2.4
keras-contrib@git+https://github.com/keras-team/keras-contrib.git#egg=keras-contrib
en-core-web-sm@https://github.com/explosion/spacy-models/releases/download/en_core_web_sm-2.3.0/en_core_web_sm-2.3.0.tar.gz#egg=en-core-web-sm
pip install -r requirements.txt
pip install -U dist/mypack-....whl

But this is not a clean way, since a wheel should be self-contained.

Thank you for any hint!


Environment

  • Python: 3.7.0
  • Pip: 20.2.4
  • setuptools: 50.3.2
madkote

1 Answer


Some time ago it was possible to define a single requirements.txt or similar containing both specs for PyPI packages and links to repositories and archives.

That required parsing the requirements.txt and splitting it into "requirements" and "dependencies", where "requirements" would contain definitions of PyPI packages and "dependencies" the links.

setuptools has separate setup() arguments for these: install_requires and dependency_links.

And it really worked: one could define a requirements.txt and install a package both via python setup.py install and via pip install .. Moreover, it was possible to install just the dependencies via pip install -r requirements.txt. All ways worked and allowed a single place to define all requirements, including non-PyPI links.
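
That historical split can be sketched in a few lines. Note that split_requirements is a hypothetical helper for illustration, not part of setuptools:

```python
# Sketch: split requirements.txt lines into install_requires (plain PyPI
# specifiers) and dependency_links (VCS/archive links), as was done back then.
def split_requirements(lines):
    install_requires, dependency_links = [], []
    for line in lines:
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        if "://" in line:
            dependency_links.append(line)  # a VCS or archive link
        else:
            install_requires.append(line)  # a plain PyPI specifier
    return install_requires, dependency_links

reqs, links = split_requirements([
    "spacy==2.3.2",
    "# a comment",
    "git+https://github.com/keras-team/keras-contrib.git#egg=keras-contrib",
])
```

The two resulting lists would then be passed to setup() as install_requires and dependency_links respectively.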

However, support for the dependency_links argument was dropped by pip as of v19. And here's the weird part: it was not dropped by setuptools. But there's more.

As of today, pip:

  • Supports only install_requires.
  • Prefers PEP 508 notation for dependencies in definitions of packages (install_requires) and standalone requirements.txt or similar.
  • Aborts installation of packages which contain links in their install_requires.

Your dependency definitions mix two notations: prefixes like keras-contrib@ come from PEP 508, while the #egg= parts come from setuptools' links notation.

This is not an issue in itself: pip ignores the "eggs", since the names are already defined before the @.
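
This can be checked with the third-party packaging library, which implements the PEP 508 grammar (pip vendors the same machinery internally). The snippet below is just a sketch for inspection, not something your setup.py needs:

```python
# Sketch: how a PEP 508 direct reference is parsed.
# Assumes the third-party "packaging" library is installed.
from packaging.requirements import Requirement

req = Requirement(
    "keras-contrib@ git+https://github.com/keras-team/keras-contrib.git"
    "#egg=keras-contrib"
)
print(req.name)  # everything before "@" is the distribution name
print(req.url)   # the direct URL, with the #egg= fragment kept but unused
```

The name comes entirely from the part before @, so the #egg= fragment is redundant for pip.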

I believe the installation of the package via pip works fine, i.e.:

  pip install .

However, the issues will arise if the package is installed via setuptools, i.e.:

  python setup.py install

setuptools does not understand PEP 508 notation and ignores such links in install_requires. As of today, to make setuptools follow links, both install_requires and dependency_links have to be used, e.g.:

setup(
   ...
   install_requires=[
      ...
      "keras_contrib==2.0.8",
      ...
   ],
   dependency_links=[
      "https://github.com/keras-team/keras-contrib/tarball/master#egg=keras_contrib-2.0.8",
      ...
   ],
)

Here are several tricky points:

  • A single dependency is defined in 2 places: a package name in install_requires and a link in dependency_links to resolve it.
  • The link is not git+https://.../....git, but a link to an archive: https://.../tarball/....
  • The egg name is in snake_case, not in dash-case. While it's possible to use dash-case, that does not allow specifying the version.
  • The version in install_requires is delimited with == and in dependency_links with -.
  • It's possible to omit the version, but the only viable use case for that is when the package is not present on PyPI and is rarely updated. If the package is present on PyPI but an unpublished version is needed, the version must be specified.

And here's the bummer: fixing the links for setuptools will break pip, as a PEP 508 direct reference does not allow a version specifier next to the URL. Keeping keras-contrib==x.y.z @ ... in install_requires will make pip search for a package named keras-contrib==x.y.z, where ==x.y.z is not a version but part of the name. At the same time, not specifying a version will make setuptools grab the latest version available on PyPI, not the one at the link from dependency_links.
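
The clash can be demonstrated with the third-party packaging library, which follows the PEP 508 grammar: a requirement string cannot carry both a version specifier and a direct URL. This is only an illustrative sketch:

```python
# Sketch: PEP 508 rejects mixing a version pin with a direct URL.
# Assumes the third-party "packaging" library is installed.
from packaging.requirements import InvalidRequirement, Requirement

try:
    Requirement(
        "keras-contrib==2.0.8 @ "
        "git+https://github.com/keras-team/keras-contrib.git"
    )
    rejected = False
except InvalidRequirement:
    rejected = True
print(rejected)  # the mixed spec does not parse
```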

In your case, neither keras-contrib nor en-core-web-sm is present on PyPI, so using keras_contrib@git+https://... plus dependency_links without a version specified might work.

Otherwise, stick to pip install . and avoid using python setup.py install if the package depends on links.


Trivia: several issues on GitHub are still open, and PEP 508 has been in Active state since 2015. Digging around the source code reveals that setuptools is a wrapper around Python's distutils. setuptools is not part of Python's stdlib, but the distutils docs imply they will be removed once the setuptools docs are updated. At the same time, pip is already bundled with Python installations as a module. And yet we have pipfiles, pipenv, poetry, conda, pipx, pip-tools, shiv, spack, and the rest. Looks a bit overwhelming.

oblalex