I believe using pip-compile from pip-tools is good practice when constructing your requirements.txt. It makes your builds predictable and deterministic.
The pip-compile command lets you compile a requirements.txt file from your dependencies, specified in either setup.py or requirements.in.
Here are my recommended steps for constructing your requirements.txt (if using requirements.in):
- Create a virtual env and install pip-tools there:
$ python -m venv /path/to/venv
$ source /path/to/venv/bin/activate
(venv)$ python -m pip install pip-tools
- Specify your application/project's direct dependencies in your requirements.in file:
# requirements.in
requests
boto3==1.16.51
- Use pip-compile to generate requirements.txt:
$ pip-compile --output-file=- > requirements.txt
Your requirements.txt file will then contain:
#
# This file is autogenerated by pip-compile
# To update, run:
#
#    pip-compile --output-file=-
#
boto3==1.16.51
    # via -r requirements.in
botocore==1.19.51
    # via
    #   boto3
    #   s3transfer
certifi==2020.12.5
    # via requests
chardet==4.0.0
    # via requests
idna==2.10
    # via requests
jmespath==0.10.0
    # via
    #   boto3
    #   botocore
python-dateutil==2.8.1
    # via botocore
requests==2.25.1
    # via -r requirements.in
s3transfer==0.3.3
    # via boto3
six==1.15.0
    # via python-dateutil
urllib3==1.26.2
    # via
    #   botocore
    #   requests
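To actually install from the compiled file, plain pip works, or you can use pip-sync (also part of pip-tools), which makes the virtual env match requirements.txt exactly, removing anything that isn't listed:

(venv)$ python -m pip install -r requirements.txt   # plain pip install
(venv)$ pip-sync requirements.txt                   # or: sync the env to match the file exactly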
Your application should always work with the dependencies installed from this generated requirements.txt. If you have to update a dependency, you just need to update the requirements.in file and re-run pip-compile. I believe this is a much better approach than doing pip freeze > requirements.txt, which I see some people do.
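If you just want newer versions of what's already in requirements.in, you don't even need to edit the file: pip-compile has flags to bump the pins for you, within whatever constraints requirements.in declares (note that plain pip-compile writes requirements.txt next to requirements.in by default):

$ pip-compile --upgrade-package requests   # bump a single package
$ pip-compile --upgrade                    # bump everything requirements.in allows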
I guess the main advantage of using this is that you can keep track of the actual direct dependencies of your project in a separate requirements.in file.
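The entries in requirements.in also don't have to be exact pins; it accepts the usual pip/PEP 440 version specifiers, so you can constrain only what matters. A hypothetical example:

# requirements.in
requests>=2.20,<3   # any compatible 2.x release is fine
boto3==1.16.51      # pinned exactly on purpose

pip-compile will still resolve and pin every package, direct and transitive, in the generated requirements.txt.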
I find this very similar to how dependencies are managed in a Node.js project with package.json (requirements.in) and package-lock.json (requirements.txt).
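One more tip: if you want the compiled file to be even stricter, pip-compile can embed package hashes with --generate-hashes, and pip will then verify those hashes at install time:

$ pip-compile --generate-hashes --output-file=- > requirements.txt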