
I'm a Java/Scala dev transitioning to Python for a work project. To dust off the cobwebs on the Python side of my brain, I wrote a webapp that acts as a front-end for Docker during local development work. I'm now working on packaging it up and, as such, am learning about setup.py and virtualenv. Coming from the JVM world, where dependencies aren't "installed" so much as downloaded to a repository and referenced when needed, the way pip handles things is a bit foreign. It seems like best practice for production Python work is to first create a virtual environment for your project, do your coding work, and then package it up with setup.py.

My question is, what happens on the other end when someone needs to install what I've written? They too will have to create a virtual environment for the package, but won't know how to set it up without inspecting the setup.py file to figure out what version of Python to use, etc. Is there a way for me to create a setup.py file that also creates the appropriate virtual environment as part of the install process? If not, or if that's considered a "no" as this respondent stated in this SO post, what is considered "best practice" in this situation?

snerd
  • Are you trying to distribute your package, or do you just need to get it running properly on the server? – taras Jan 13 '17 at 17:27
  • "what version of python to use" should be documented in the README. What do your users need to inspect `setup.py` for? – jwodder Jan 13 '17 at 17:49

2 Answers


You can think of virtualenv as providing isolation for every package you install using pip. It is a simple way to handle different versions of Python and of packages. For instance, say you have two projects which use the same packages but different versions of them. By using virtualenv you can isolate those two projects and install the different versions of the packages separately, rather than into your working system.
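
A rough command-line sketch of that isolation (the project paths, library, and version numbers are purely illustrative):

```sh
# One isolated environment per project
virtualenv ~/venvs/project-a
virtualenv ~/venvs/project-b

# Each environment gets its own copy of the library, at its own version
~/venvs/project-a/bin/pip install 'requests==2.9.1'
~/venvs/project-b/bin/pip install 'requests==2.12.4'
```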

Now, let's say you want to work on a project with your friend. In order to have the same packages installed, you have to share somehow which packages, and which versions of them, your project depends on. If you are delivering a reusable package (a library), then you need to distribute it, and this is where setup.py helps. You can learn more in the setuptools Quick Start.
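
For example, a minimal setup.py for a library could look roughly like this (the project name, dependency, and version numbers below are placeholders, not taken from the question):

```python
from setuptools import setup, find_packages

setup(
    name='docker-frontend',        # placeholder project name
    version='0.1.0',
    packages=find_packages(),      # pick up all packages in the source tree
    install_requires=[
        'flask>=0.11',             # illustrative runtime dependency
    ],
    python_requires='>=3.5',       # declares which Python versions are supported
)
```

With a manifest like this, pip can resolve and install the declared dependencies when someone installs the package.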

However, if you are working on a web site (an application rather than a library), all you need is to put the library versions into a separate file. Best practice is to create separate requirements files for tests, development and production. To see the format of such a file, run pip freeze: you will be presented with a list of the packages installed on the system (or in the virtualenv) right now. Put it into a file and you can install from it later, on another PC, into a completely clean virtualenv, using pip install -r development.txt
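
Concretely, the workflow might look like this (the file name development.txt is just the convention used above):

```sh
# Record the exact versions installed in the current (virtual) environment
pip freeze > development.txt

# Later, on another machine, inside a fresh virtualenv
pip install -r development.txt
```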

And one more thing: please do not pin the strict versions that pip freeze shows; most of the time you want >= at least some X.X version. The good news here is that pip handles dependencies on its own. That means you do not have to list transitive dependencies there; pip will sort them out.
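
For instance, a hand-maintained requirements file with minimum versions rather than exact pins might look like this (the package names and versions are only examples):

```text
# requirements.txt: minimum versions instead of exact pins
flask>=0.11
docker-py>=1.10
```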

Speaking of deployment, you may want to check out tox, a tool for managing virtualenvs. It helps a lot with deployment.
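
A minimal tox.ini sketch, assuming pytest is used for the test suite (the environment list and test command are assumptions, not part of the answer above):

```ini
# tox.ini
[tox]
envlist = py27,py35

[testenv]
deps =
    pytest
    -rrequirements.txt
commands = pytest
```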

taras
  • is there a tool that will inspect the contents of setup.py and create the appropriate virtual env as part of the setup process, or is the end user expected to first inspect the README and/or setup.py and set up the virtual environment manually? – snerd Jan 14 '17 at 04:39
  • You can think of `setup.py` as a package manifest. It declares attributes of the package and its dependencies. So in the end you have a single `library` which can be distributed, installed and reused in any application. – taras Jan 14 '17 at 09:19
  • The virtualenv has to be created manually; however, the packages that have to be installed can be installed automatically into the created virtualenv with `pip install -r list_of_packages_to_install.txt`, where `-r` tells pip to read the list from a requirements file. – taras Jan 14 '17 at 09:21
  • that answers my question -- t/y ! – snerd Jan 14 '17 at 20:00

By default, Python installs packages into the system environment, which needs administrator access. Virtualenv is able to localise the installation to an isolated environment.
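
As a sketch of the difference (the package name is a placeholder):

```sh
# Installing into the system Python typically requires admin rights
sudo pip install somepackage

# Installing into a virtualenv does not
virtualenv env
. env/bin/activate
pip install somepackage
```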

For deployment/distribution of a package, you can choose to:

  1. Distribute the source code; users then need to run python setup.py install, or
  2. Package your project and upload it to PyPI or a custom devpi server, so users can simply run pip install <yourpackage> (both options are sketched below).
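
In command form, those two options look roughly like this (the package name is a placeholder):

```sh
# Option 1: install from a source checkout
python setup.py install

# Option 2: once the package is on PyPI (or a devpi server)
pip install yourpackage
```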

However, this runs into the issue noted at the top: without a virtualenv, users need administrator access to install any Python package.

In addition, the PyPI package world contains a certain number of badly tested packages that don't work out of the box.

Note: virtualenv itself is actually a hack to achieve isolation.

mootmoot
  • interesting article -- especially since the author states at the end that he doesn't use pip/virtualenv, etc for development. it seems like in doing so, he's got to either use some other means of virtualization or deal with the fact that he can't have some projects coexist on the same box without running into wonky whatnot. – snerd Jan 14 '17 at 04:43
  • @DavidHoliday Actually, all languages suffer similar issues during continuous-integration deployment. Changes to default function structures, explicit external library requirements, etc., will break the integration loop. Even tools like Docker only mitigate the issues; they don't solve them. – mootmoot Jan 16 '17 at 08:53