I need to deploy a Python script that requires a mix of public (pip installable) and "local" (pip install path/to/local/package) packages. The computer I'm deploying to will not have internet access or access to the local packages, but will likely have Python. Both preparation and deployment will be done on Windows.
My solution so far is to create a venv:
python -m venv myvenv
myvenv\Scripts\activate
pip install [packages available from pip]
pip install [local packages]
Then I have two options:
1) Ship the venv directory
- ship the venv directory alongside my python script, so someone could call the script with something like:
myvenv\Scripts\python myscript.py
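For (1), a minimal launcher sketch of how I'd expect someone to run it, assuming the venv directory and the script are shipped side by side (the .bat file name and layout are just my assumption):
@echo off
rem run_myscript.bat - call myscript.py with the interpreter bundled in the shipped venv
rem %~dp0 is the directory this .bat lives in, so paths are relative to the shipped folder
"%~dp0myvenv\Scripts\python.exe" "%~dp0myscript.py" %*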
2) Ship the downloaded dependencies
- From myvenv, create requirements.txt:
pip freeze > requirements.txt
which captures the packages I installed and any dependencies pip pulled in for them
- Download the whl/zip files for those dependencies:
pip download -r requirements.txt -d local_package_files --find-links=path/to/local/package/zip
- Ship the local_package_files directory (plus requirements.txt) with myscript.py, then on the computer I'm deploying to do something like:
python -m venv deployedenv
deployedenv\Scripts\activate
pip install -r requirements.txt --no-index --find-links=local_package_files
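Put together, my current picture of (2) is roughly this; the first block runs on the machine with internet and the local package source, and everything after the copy happens on the offline machine:
rem -- build machine --
pip freeze > requirements.txt
pip download -r requirements.txt -d local_package_files --find-links=path\to\local\package\zip
rem -- copy myscript.py, requirements.txt and local_package_files to the offline machine, then there: --
python -m venv deployedenv
deployedenv\Scripts\activate
pip install --no-index --find-links=local_package_files -r requirements.txt
python myscript.py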
I can't put my finger on it, but something feels a bit fragile about these solutions. If I do (1), I think it works as long as I'm always transferring between like OSes (e.g. from Windows to Windows). With (2), I'm not sure it is stable if the destination computer has a different Python version than the one that downloaded everything.
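For what it's worth, pip download can pin the target interpreter and platform when restricted to wheels, along the lines of the command below (3.9 and win_amd64 are just example values), but I haven't tested how that interacts with local packages that only exist as sdists/zips:
pip download -r requirements.txt -d local_package_files --only-binary=:all: --python-version 3.9 --platform win_amd64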
I've tried looking through other SO posts, but they were usually missing one element (e.g. they bundle dependencies but still need internet connectivity, or they bundle pip dependencies but don't handle local packages, etc.).
Suggestions on how to improve/streamline this, or on where it will break, are appreciated.