This answer (and the one above it) explains a way to install pip requirements on an offline machine by first involving a machine with internet:
If you want to install Python libraries and their dependencies offline, follow these steps on a network-connected machine with the same OS and Python version installed:

1. Create a `requirements.txt` file listing the libraries you wish to download, e.g.:

   ```
   Flask==0.12
   requests>=2.7.0
   scikit-learn==0.19.1
   numpy==1.14.3
   pandas==0.22.0
   ```

   One option for creating the requirements file is to run `pip freeze > requirements.txt`, which lists every library in your environment; you can then edit `requirements.txt` and remove the ones you don't need.

2. Run `mkdir wheelhouse && pip download -r requirements.txt -d wheelhouse` to download the libraries and their dependencies into the `wheelhouse` directory.

3. Copy `requirements.txt` into the `wheelhouse` directory.

4. Archive `wheelhouse` into `wheelhouse.tar.gz` with `tar -zcf wheelhouse.tar.gz wheelhouse`.

Then upload `wheelhouse.tar.gz` to your target machine:

1. Run `tar -zxf wheelhouse.tar.gz` to extract the files.

2. Run `pip install -r wheelhouse/requirements.txt --no-index --find-links wheelhouse` to install the libraries and their dependencies.
This is exactly what I'm doing, except that my `requirements.txt`, for now, contains only:

```
notebook==7.0.0a4
```

which is Jupyter Notebook.
But oddly, I'm getting this error:

```
ERROR: Could not find a version that satisfies the requirement pyzmq>=17 (from jupyter-server) (from versions: none)
ERROR: No matching distribution found for pyzmq>=17
```
I figured out how to make progress on this error: adding `pyzmq==17` to my `requirements.txt`.
But then the same error appears for another package, so it seems I could just keep explicitly adding these transitive dependencies to `requirements.txt` one by one. That seems less than optimal, especially if there are many of them. Is there something I could add to the download command to pull in all of these dependencies without listing them manually?
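For the record, the manual workaround I'm describing is a `requirements.txt` that grows one pin per failing transitive dependency; the two entries below are the ones from my own run, and the rest would have to be added the same way as each new error appears:

```
notebook==7.0.0a4
pyzmq==17
# ...one more pin for each transitive dependency that fails to resolve
```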