
This answer (and the one above it) explains a way to install pip requirements on an offline machine by first involving a machine with internet:


If you want to install Python libs and their dependencies offline, follow these steps on a machine with the same OS, network access, and Python installed:

  1. Create a requirements.txt file listing the libraries you wish to download, one per line, e.g.:

         Flask==0.12
         requests>=2.7.0
         scikit-learn==0.19.1
         numpy==1.14.3
         pandas==0.22.0

One option for creating the requirements file is to use pip freeze > requirements.txt. This will list all libraries in your environment. Then you can go into requirements.txt and remove unneeded ones.

  2. Execute mkdir wheelhouse && pip download -r requirements.txt -d wheelhouse to download the libs and their dependencies into the wheelhouse directory

  3. Copy requirements.txt into the wheelhouse directory

  4. Archive wheelhouse into wheelhouse.tar.gz with tar -zcf wheelhouse.tar.gz wheelhouse

Then upload wheelhouse.tar.gz to your target machine:

  1. Execute tar -zxf wheelhouse.tar.gz to extract the files

  2. Execute pip install -r wheelhouse/requirements.txt --no-index --find-links wheelhouse to install the libs and their dependencies
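Before shipping the archive, it can help to confirm that every pinned requirement actually has a matching file in the wheelhouse. The sketch below is my own helper (the function names `normalize` and `missing_from_wheelhouse` are not part of pip), assuming pip's standard file naming where the distribution name is the first dash-separated segment, and PEP 503 name normalization so `scikit-learn` matches `scikit_learn`:

```python
import pathlib
import re
import tempfile

def normalize(name: str) -> str:
    """PEP 503 name normalization: 'scikit_learn' -> 'scikit-learn'."""
    return re.sub(r"[-_.]+", "-", name).lower()

def missing_from_wheelhouse(requirements: str, wheelhouse: pathlib.Path) -> list[str]:
    """Return requirement lines with no matching file in the wheelhouse."""
    # Wheel filenames look like: name-version-pytag-abitag-platform.whl,
    # so the distribution name is everything before the first dash.
    available = {normalize(f.name.split("-")[0]) for f in wheelhouse.iterdir()}
    missing = []
    for line in requirements.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        # Split off the version specifier (==, >=, extras, markers) to get the bare name.
        name = re.split(r"[=<>!~\[;]", line, maxsplit=1)[0].strip()
        if normalize(name) not in available:
            missing.append(line)
    return missing

# Demo with a fake wheelhouse so the check runs without network access.
with tempfile.TemporaryDirectory() as tmp:
    wh = pathlib.Path(tmp)
    (wh / "Flask-0.12-py2.py3-none-any.whl").touch()
    (wh / "scikit_learn-0.19.1-cp38-cp38-manylinux1_x86_64.whl").touch()
    reqs = "Flask==0.12\nscikit-learn==0.19.1\nnumpy==1.14.3\n"
    print(missing_from_wheelhouse(reqs, wh))  # -> ['numpy==1.14.3']
```

This only checks the top-level pins in requirements.txt, not transitive dependencies, which is exactly why the question below arises.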


This is exactly what I'm doing, except my requirements.txt, for now, is just:

notebook==7.0.0a4

which is Jupyter Notebook.

But oddly, I'm getting the error:

ERROR: Could not find a version that satisfies the requirement pyzmq>=17 (from jupyter-server) (from versions: none)                                      
ERROR: No matching distribution found for pyzmq>=17

I figured out how to make progress on this error:

  • Adding pyzmq==17 to my requirements.txt

But then the same error appears for a similar package, so it seems like I could just keep explicitly adding these packages to requirements.txt. That seems less than optimal, though, especially if there are a lot of packages to add. Is there something I could add to the wheel-building command to include all these dependencies without listing them manually?

J.Todd

1 Answer


Sounds like you've followed step 1 and manually created the requirements.txt with the contents notebook==7.0.0a4, but missed the part where it says:

One option for creating the requirements file is to use pip freeze > requirements.txt. This will list all libraries in your environment. Then you can go in to requirements.txt and remove un-needed ones.

If I manually install notebook on the "internet connected" computer (it's probably best to use a fresh virtual env to prevent other projects' deps leaking in) with:

pip install notebook==7.0.0a4

Then export the requirements with:

pip freeze > requirements.txt

This gives me a requirements.txt file containing:

aiofiles==0.8.0
aiosqlite==0.17.0
anyio==3.6.1
argon2-cffi==21.3.0
argon2-cffi-bindings==21.2.0
asttokens==2.0.5
attrs==21.4.0
Babel==2.10.1
backcall==0.2.0
beautifulsoup4==4.11.1
bleach==5.0.0
certifi==2022.5.18.1
cffi==1.15.0
charset-normalizer==2.0.12
debugpy==1.6.0
decorator==5.1.1
defusedxml==0.7.1
entrypoints==0.4
executing==0.8.3
fastjsonschema==2.15.3
idna==3.3
importlib-metadata==4.11.4
importlib-resources==5.7.1
ipykernel==6.13.1
ipython==8.4.0
jedi==0.18.1
Jinja2==3.1.2
json5==0.9.8
jsonschema==4.6.0
jupyter-client==7.3.4
jupyter-core==4.10.0
jupyter-server==1.17.1
jupyter-ydoc==0.1.10
jupyterlab==4.0.0a26
jupyterlab-pygments==0.2.2
jupyterlab-server==2.14.0
MarkupSafe==2.1.1
matplotlib-inline==0.1.3
mistune==0.8.4
nbclient==0.6.4
nbconvert==6.5.0
nbformat==5.4.0
nest-asyncio==1.5.5
notebook==7.0.0a4
notebook-shim==0.1.0
packaging==21.3
pandocfilters==1.5.0
parso==0.8.3
pexpect==4.8.0
pickleshare==0.7.5
prometheus-client==0.14.1
prompt-toolkit==3.0.29
psutil==5.9.1
ptyprocess==0.7.0
pure-eval==0.2.2
pycparser==2.21
Pygments==2.12.0
pyparsing==3.0.9
pyrsistent==0.18.1
python-dateutil==2.8.2
pytz==2022.1
pyzmq==23.1.0
requests==2.28.0
Send2Trash==1.8.0
six==1.16.0
sniffio==1.2.0
soupsieve==2.3.2.post1
stack-data==0.2.0
terminado==0.15.0
tinycss2==1.1.1
tornado==6.1
traitlets==5.2.2.post1
typing-extensions==4.2.0
urllib3==1.26.9
wcwidth==0.2.5
webencodings==0.5.1
websocket-client==1.3.2
y-py==0.5.0
ypy-websocket==0.1.13
zipp==3.8.0

I think this is the way to make the requirements file, which you then feed into step 2, so that all the required deps go into the wheelhouse for export to the "offline computer".
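If you would rather not pick through the freeze listing by hand to find what belongs to notebook, one option is to freeze the environment once before and once after installing the package, and keep only the new pins. This is my own sketch, not a pip feature (`parse_freeze` and `new_requirements` are hypothetical helper names), and it assumes plain `name==version` lines as produced by pip freeze:

```python
def parse_freeze(text: str) -> dict[str, str]:
    """Parse `pip freeze` output (name==version lines) into a mapping."""
    pins = {}
    for line in text.splitlines():
        line = line.strip()
        if "==" in line and not line.startswith("#"):
            name, version = line.split("==", 1)
            pins[name] = version
    return pins

def new_requirements(before: str, after: str) -> list[str]:
    """Pins present after installing a package but not before:
    the package itself plus everything pip pulled in for it."""
    old = parse_freeze(before)
    return sorted(f"{n}=={v}" for n, v in parse_freeze(after).items() if n not in old)

# Abbreviated demo; real freeze output would be much longer.
before = "pip==22.0\nsetuptools==62.0.0\n"
after = before + "notebook==7.0.0a4\npyzmq==23.1.0\ntornado==6.1\n"
print(new_requirements(before, after))
# -> ['notebook==7.0.0a4', 'pyzmq==23.1.0', 'tornado==6.1']
```

In a fresh venv the "before" listing is nearly empty, so the diff and the plain freeze output coincide; this approach mainly helps when you can't start from a clean environment.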

v25
  • I will try this and it may work, but first, a comment on the top answer (same exact thing except doesn't mention the pip freeze thing), someone comments: "problem with this is that the dependencies might have other dependencies and those won't be downloaded." and the highly upvoted response was: 'Not true, "pip install --download" also downloads dependencies, so the above commands will work correctly even if your requirements have additional dependencies.' – J.Todd Jun 11 '22 at 15:57
  • And the reason this solution is unfortunate is this: It will get all the dependencies on your system. So he says go through the list and delete unneeded ones. So then you need to basically figure out which ones are sub dependencies or sub dependencies of sub dependencies. Would venv fix this? If I use pip freeze in a venv, will I just get the relevant deps? I haven't gotten around to trying venv yet – J.Todd Jun 11 '22 at 15:59
  • Yes, I checked, venv solves this dilemma but you need the `-l` option as in `pip freeze -l > requirements.txt ` – J.Todd Jun 11 '22 at 16:45
  • I wanted to add, after trying this, it just doesn't work. All the requirements never seem to get pulled with a more complex package. Downloading a single package at a time with pip download works – J.Todd Jun 12 '22 at 20:07