I'm looking for others who have experience with Python venv and deploying applications via rpm. My goals are to:
- Use pip for Python dependencies
- Use venv to keep application environments/deps separate
- Use rpm for deployment (required by our company's internal audit, etc.)
I have a build server (a Jenkins slave) for each architecture (read: distro) that we deploy on. My original (and only) plan was for a Jenkins job on each slave to:
- Create a venv
- Activate the venv
- Run python setup.py build/install (within the rpm spec)
- Archive the rpm as a build artifact
- Rejoice
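Roughly sketched as a build script (the application name and paths are placeholders, and the spec file itself is not shown):

```shell
#!/bin/sh
# Rough sketch of the intended Jenkins job (names are placeholders).
set -e
APP=myapp

# Steps 1-2: create and activate the venv
python3 -m venv "$APP-venv"
. "$APP-venv/bin/activate"

# Step 3: build/install within the rpm spec -- this is the step that
# breaks, since the venv cannot be built under RPM_BUILD_ROOT and
# relocated afterwards.
# rpmbuild -bb "$APP.spec"

# Step 4: Jenkins archives the resulting rpm as an artifact
```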
I never got to steps 2 or 3, so I don't know what dragons lurk there; the main issue is with the "Create a venv" step. venvs are not "relocatable", and RPM packages from RPM_BUILD_ROOT, a self-contained filesystem tree in a tmpdir, so I can't install the venv into the build root. I'd have to install the venv into the ACTUAL location on the build server where it will live once the rpm is installed. That is not ideal, for reasons you can probably guess (collisions with other applications, other things running on the build servers, etc.).
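The non-relocatability is easy to demonstrate: a venv hardcodes the absolute path it was created at into the scripts it generates and into `pyvenv.cfg`, so moving it from the build root to the final path breaks it.

```shell
#!/bin/sh
# Demonstrate that a venv bakes its absolute creation path into the
# scripts it generates, which is why it cannot simply be moved.
set -e

tmp=$(mktemp -d)
python3 -m venv "$tmp/venv"

# The pip launcher's shebang embeds the venv's absolute interpreter
# path, e.g. #!/tmp/tmp.XXXXXX/venv/bin/python3
head -1 "$tmp/venv/bin/pip"

# pyvenv.cfg records the absolute path of the base interpreter
grep '^home' "$tmp/venv/pyvenv.cfg"

rm -rf "$tmp"
```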
I don't want to run setup.py on my production boxes or download packages at install time. I want to make sure everything works, and have it all downloaded and packaged, before the deploy happens.
The closest thing I have found is dh-virtualenv, from this SO question. It looks promising: from what I can tell it installs directly into the final directory (not a temp build root) and cleans up after itself, but that still seems like bad practice. Is there a better way? Am I missing something? It seems I'm stuck doing it the Spotify way.
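For what it's worth, one workaround I've seen described (essentially what dh-virtualenv automates on Debian) is to build the venv inside the build root at its *final* prefix, then strip the build-root prefix out of the paths the venv baked in. A hedged sketch of the `%install` section; the package path `/opt/myapp` is a placeholder:

```spec
%install
# Create the venv under the buildroot, at the path it will occupy
# after the rpm is installed (/opt/myapp/venv is hypothetical).
python3 -m venv %{buildroot}/opt/myapp/venv
%{buildroot}/opt/myapp/venv/bin/pip install .

# The venv has %{buildroot} baked into its script shebangs and
# pyvenv.cfg; rewrite them so the paths are correct at install time.
find %{buildroot}/opt/myapp/venv/bin -type f -exec \
    sed -i "s|%{buildroot}||" {} +
sed -i "s|%{buildroot}||" %{buildroot}/opt/myapp/venv/pyvenv.cfg
```

This keeps all pip downloads at build time and packages a working venv, at the cost of the sed fix-up step.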