I have an application that I develop on one server and run on another. My question is about transferring this application to its destination server.
The application consists mainly of a large package with subpackages. There are also a small startup script, a sample config file, and a systemd `.service` file. I have added `setup.py` and `MANIFEST.in` files to create a standard installable package.
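For reference, my `setup.py` looks roughly like this (names here are placeholders, not my real project layout; `entry_points` is how I install the startup script, and `data_files` is one way to ship the sample config and `.service` file):

```python
# Illustrative setup.py -- project, package, and file names are hypothetical.
from setuptools import setup, find_packages

setup(
    name="myapp",                      # hypothetical project name
    version="0.1.0",
    packages=find_packages(),          # picks up the large package and its subpackages
    # Install the startup script as a console entry point rather than a raw
    # script file (assumes myapp/cli.py defines a main() function):
    entry_points={"console_scripts": ["myapp = myapp.cli:main"]},
    # Non-code files (sample config, systemd unit) listed as data_files;
    # MANIFEST.in controls what goes into the source distribution.
    data_files=[("share/myapp", ["myapp.conf.sample", "myapp.service"])],
)
```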
This is my current workflow:

On the development server:

- do some work
- `git push` to a local repository

On the production server:

- activate a virtual env
- `pip install --upgrade git+http://...`
This does the job, but recently I read this question: *Where in a virtualenv does my code go?* Now I'm afraid that I'm probably doing it the wrong way.
In general I do agree with this answer (shortened):

> `virtualenv` provides a python interpreter instance, not an application instance. ... For example, you might have a project where you have multiple applications using the same virtualenv.
Now, I want to be able to share the same virtual env between applications. I'd like to change my workflow, but I don't know how.
I'm not even sure which application files should stay outside of the virtual env. My whole application, or just everything except the Python package, i.e. the scripts and config files?
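To check what actually ends up inside the virtual env, I can ask the interpreter where `pip` puts installed package code; inside an activated env these paths point into the env itself (a quick sketch, nothing app-specific here):

```python
# Show where installed package code lands for the running interpreter.
# Inside an activated virtual env, both paths point into the env directory.
import sys
import sysconfig

print("interpreter prefix:", sys.prefix)
print("packages go to:", sysconfig.get_paths()["purelib"])
```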
Or should I use the virtual env for dependency packages only, skip `pip`, and install my app with just `git checkout`?