
I have a bunch of machines that are isolated from the internet, and only have access to some services on the local network.

I want the users of those machines to be able to search for and install whatever Python libraries they want from a local PyPI server. I therefore created a global pip.conf under /etc/ with the following lines:

[global]
trusted-host = my.pypi-server.com
index-url = https://my.pypi-server.com/
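
For reference, the same settings can also be passed as standard pip install flags (using the same host and package names as above), which is a quick way to confirm the server itself works independently of the config file:

# bypass pip.conf and point pip at the local index explicitly
$  pip install --index-url https://my.pypi-server.com/ --trusted-host my.pypi-server.com fancy-lib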

This works when the name of the library is known and you just run pip install fancy-lib. However, when searching for a package, pip search seems to ignore index-url and goes to pypi.python.org instead:

$  pip search fancy-lib -vvvv
Starting new HTTPS connection (1): pypi.python.org

$  pip install fancy-lib -vvvv
Collecting fancy-lib
  1 location(s) to search for versions of fancy-lib:
  * https://my.pypi-server.com/simple/fancy-lib
  Getting page https://my.pypi-server.com/simple/fancy-lib
  Looking up "https://my.pypi-server.com/simple/fancy-lib" in the cache
  No cache entry available
  Starting new HTTPS connection (1): https://my.pypi-server.com/

How can I make pip search work with my local PyPI server?


1 Answer


It seems it was just a matter of reading the manual. The search index is independent of the install index, and is specified using the index option:

[global]
trusted-host = my.pypi-server.com
index = https://my.pypi-server.com/
index-url = https://my.pypi-server.com/

Alternatively, the same options can be scoped to per-command sections instead of [global]:

[global]
trusted-host = my.pypi-server.com

[search]
index = https://my.pypi-server.com/

[install]
index-url = https://my.pypi-server.com/
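
The search index can also be overridden per invocation; pip search accepts an --index flag (a sketch assuming a pip version that still ships the search command):

# one-off search against the local index, no config file needed
$  pip search --index https://my.pypi-server.com/ fancy-lib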

This is not very intuitive, and there is an open enhancement request to address it: https://github.com/pypa/pip/issues/4263
