
I am trying to upgrade a Pyramid project to Python 3. I am also exploring various options for making the build system more modern.

Currently we are using Python's buildout to set up an instance. The approach is simple:

  1. All the required eggs (including `app`, which is my package) are listed in a buildout configuration file with their exact versions pinned.
  2. Run buildout to fetch those exact eggs (including third-party packages) from a local package server.
  3. Run the instance using `./bin/pserve config.ini`.
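For context, the current buildout configuration looks roughly like this (a sketch only; the part names, versions, and the package-server URL are placeholders, not our actual config):

```ini
[buildout]
parts = app
# Every egg (including third-party ones) is pinned to an exact version.
versions = versions
# Eggs are fetched from our local package server.
find-links = http://packages.internal.example/

[versions]
app = 1.2.3
pyramid = 1.10.4

[app]
recipe = zc.recipe.egg
eggs =
    app
    pyramid
```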

For my new app source code with the Python 3 changes, I am trying to get rid of buildout entirely and just use pip instead. This is what I have done so far (in a Docker container):

git clone git@github.com:org/app.git
cd app
# Our internal components are fetched using the `git` directive inside `requirements.txt`.
pip install -r requirements.txt  # Mostly from PyPI.
pip install .
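Inside Docker, the steps above amount to something like the following Dockerfile (a sketch; the base image tag and paths are illustrative, and it assumes `requirements.txt` pins everything):

```dockerfile
FROM python:3-slim

WORKDIR /srv/app

# Install pinned dependencies first so this layer is cached
# independently of application code changes.
COPY requirements.txt .
RUN pip install -r requirements.txt

# Then install the application itself.
COPY . .
RUN pip install .

CMD ["pserve", "config.ini"]
```

Copying `requirements.txt` before the rest of the source keeps dependency installation in its own cached layer, so code-only changes do not re-download everything.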

It works, but is this the correct way to deploy an application?

Will I be able to convert the entire installation to a simple `pip install app`, and run it using `pserve config.ini`, if I do the following:

  1. Upload the latest `app` egg to my package server.
  2. Sync `setup.py` and `requirements.txt` so that pip pulls in the equivalent of `pip install -r requirements.txt` internally.
  3. `pip install app`.
  4. Copy `config.ini` to the machine where I am going to install.
  5. Run `pserve config.ini`.
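One common way to keep `setup.py` and `requirements.txt` in sync (step 2) is to have `setup.py` read the pinned file. This is a sketch of a hypothetical helper, not our actual code; lines starting with `#` or pip-specific options such as `-e` or `--index-url` are skipped, since setuptools does not understand them:

```python
def parse_requirements(text):
    """Return requirement specifiers from a requirements.txt body,
    skipping blank lines, comments, and pip-specific options."""
    requirements = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith(("#", "-")):
            continue
        requirements.append(line)
    return requirements


# In setup.py this could then feed install_requires, e.g.:
#   from pathlib import Path
#   setup(..., install_requires=parse_requirements(
#       Path("requirements.txt").read_text()))
```

Whether pinning exact versions in `install_requires` is a good idea is exactly the project-vs-library distinction discussed below; for a deployable application it is generally acceptable.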

I wanted to know if the above approach can be made to work before proceeding with egg creation, mocking a simple package server, etc. I am not sure if I can really use `pip install` for a web application, and I think `requirements.txt` has some significance in this case.

I haven't explored wheels yet, but if the above works, I will try that as well.

Since I am really new to packaging, I would appreciate some suggestions on modernizing my build using the latest tools.


Update: After reading some links like requirements.txt vs setup.py, I think `requirements.txt` is needed for web apps, especially if you want consistent behaviour for deployment purposes. A project or application seems to be different from a library, where a plain `pip install` suffices.

If that is the case, I think the ideal way is to do `pip install -r requirements.txt` and then `pip install app` from a local package server, without git cloning?
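Under that model, the deployment side would reduce to a fully pinned `requirements.txt` pointing at the local index, with internal components pinned by version instead of fetched via git. A sketch (the hostname and versions are placeholders):

```
# requirements.txt -- pinned for reproducible deployments
--index-url http://packages.internal.example/simple/
pyramid==1.10.4
waitress==1.4.3
internal-component==2.0.1
```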

Resources: install_requires vs requirements files

  • So... if you read this link, you know that if you run your app like `./bin/pserve config.ini` then you should use `requirements.txt`, right? – sanyassh Feb 13 '20 at 19:31
  • @sanyash, I am trying to run it like `pserve config.ini`; `bin/pserve` is a buildout mechanism that almost does the same thing but not quite, AFAIK. – Nishant Feb 13 '20 at 19:36
  • 1
    Anyway, you are doing `run app` in some sense, not `import app` in Python code. Thus said, using `pip install -r requirements.txt` in your dockerfile looks perfectly fine. – sanyassh Feb 13 '20 at 19:46
  • @sanyash Yeah, I wasn't really clear about what is suitable where -- the more I understand it, the `pip install -r requirements.txt` approach seems correct for a project as opposed to a library. I never knew the difference! – Nishant Feb 13 '20 at 19:50
