241

How do I upgrade all my Python packages from a requirements.txt file using the pip command?

I tried the command below:

$ pip install --upgrade -r requirements.txt

Since the packages are pinned to a specific version (Django==1.5.1), they don't seem to upgrade. Is there a better approach than manually editing the requirements.txt file?

EDIT

As Andy mentioned in his answer, the packages are pinned to specific versions, hence it is not possible to upgrade them through the pip command alone.

But we can achieve this with pip-review using the following command:

$ pip-review --auto

This will automatically upgrade all packages from requirements.txt. (pip-review originally shipped with pip-tools; it is now distributed as its own package, so install it with pip install pip-review.)

Bob Stein
abhiomkar

17 Answers

224

I already answered this question here. Here's my solution:

Because there was no easy way to upgrade packages one by one and update the requirements.txt file, I wrote pip-upgrader, which also updates the versions in your requirements.txt file for the packages you choose (or all packages).

Installation

pip install pip-upgrader

Usage

Activate your virtualenv (important, because it will also install the new versions of the upgraded packages into the current virtualenv).

cd into your project directory, then run:

pip-upgrade

Advanced usage

If the requirements are placed in a non-standard location, send them as arguments:

pip-upgrade path/to/requirements.txt

If you already know which packages you want to upgrade, simply pass them as arguments:

pip-upgrade -p django -p celery -p dateutil

If you need to upgrade to a pre-release / post-release version, add the --prerelease argument to your command.

Full disclosure: I wrote this package.

Simion Agavriloaei
  • 5
    Very useful tool! I also found another package called [pur](https://github.com/alanhamlett/pip-update-requirements) that upgrades the pinned versions as well – Gal Avineri Aug 18 '19 at 20:31
  • 1
    Awesome! It even gave me a warning when I hadn't activated my venv. It also gives you the option of which packages to upgrade after it finds all the new versions. Excellent work! – Robert Rendell Aug 30 '20 at 22:59
  • 1
    conda handles all these updating issues, and guarantees entire environment integrity. pip is greedy and selfish and only installs or updates (or downgrades) what it needs to install current packages, sometimes downgrading core packages and breaking things. Conda handles package and environment management at once. Upgrading pip AND venv OR virtualenv (OR pip-whatever) is one more unnecessary headache. I use pip as the last resort in a conda environment when conda packages are not available. This pip upgrader package makes it easier to tolerate pip when conda packages are not available. – Rich Lysakowski PhD Jul 19 '21 at 05:19
  • 3
    Pip upgrader has been discontinued per its github page. – Oliver Jun 09 '22 at 14:16
  • 1
    I tried `pur` from the comment above and can confirm it is awesome. – dee Oct 18 '22 at 14:27
  • It gives `ModuleNotFoundError: No module named 'pip._vendor.urllib3.util.wait'` error during `pip-upgrade` – alper Apr 25 '23 at 09:58
  • Ah you are right pip-upgrader is not maintained anymore. – Melroy van den Berg Aug 02 '23 at 13:34
143

You can try:

pip install --upgrade --force-reinstall -r requirements.txt

You can also ignore the installed packages and install new ones:

pip install --ignore-installed -r requirements.txt
l0b0
Freelancer
  • 7
    with that option it seems to reinstall the same version. As Andy mentioned in above answer, packages are pinned to specific version. – abhiomkar Jul 15 '14 at 17:41
  • @abhiomkar you're right, I thought you wanted to reinstall the same version (maybe to add a backport fix) – Freelancer Jul 15 '14 at 17:44
  • 3
    The second command is what I was looking for. Notice that `-I` and `--ignore-installed` are the same flag and of course it's not valid to have a `,` in there. This way no downgrades will occur during install, and after installation of the requirements is complete one can upgrade installed packages using `pip-review --auto`. – AXO Mar 26 '17 at 09:00
  • 4
    This is definitely the best solution, as it uses pip directly without having to install yet another package. – Zephaniah Grunschlag Jan 17 '21 at 22:04
  • It didn't work, I still have old version. It uninstalled the old version, then installed the old version again. – AnonymousUser Nov 12 '21 at 04:44
76

No. Your requirements file has been pinned to specific versions. If your requirements are set to that version, you should not be trying to upgrade beyond those versions. If you need to upgrade, then you need to switch to unpinned versions in your requirements file.

Example:

lxml>=2.2.0

This would upgrade lxml to any version that is 2.2.0 or newer.

lxml>=2.2.0,<2.3.0

This would upgrade lxml to the most recent version that is at least 2.2.0 but lower than 2.3.0.
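
If you want to check what a given specifier will actually accept before editing the file, here is a small illustrative sketch using the packaging library (the same library pip vendors for version handling); it assumes packaging is installed in your environment (pip install packaging):

from packaging.specifiers import SpecifierSet

# ">=2.2.0" has no upper bound, so any release from 2.2.0 upwards is acceptable.
print("2.2.3" in SpecifierSet(">=2.2.0"))          # True
print("3.0.0" in SpecifierSet(">=2.2.0"))          # True

# Adding ",<2.3.0" caps the range, so only 2.2.x releases qualify.
print("2.2.5" in SpecifierSet(">=2.2.0,<2.3.0"))   # True
print("2.3.0" in SpecifierSet(">=2.2.0,<2.3.0"))   # False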

Andy
  • 5
    I found it helpful to do the following: 1. Delete the venv. 2. Create a new one with the same name (a way to clean out all pip packages). 3. Replace all == with >= in requirements.txt. 4. pip install -r requirements.txt – zhukovgreen Jul 28 '17 at 05:17
  • 4
    `sed 's/==/>=/g' requirements.txt > TMP_FILE && mv TMP_FILE requirements.txt ` will replace all `==` with `>=` – philshem Nov 18 '19 at 09:58
29

I suggest freezing all of your dependencies in order to have predictable builds.

When doing that, you can update all dependencies at once like this:

sed -i '' 's/[~=]=/>=/' requirements.txt
pip install -U -r requirements.txt
pip freeze | sed 's/==/~=/' > requirements.txt

Note that sed -i '' is the BSD/macOS form; with GNU sed use plain sed -i. Having done the above, test your project with the new set of packages and eventually commit the requirements.txt file to the repository. The ~= pins written by the last command still allow hot-fixes (newer patch releases) to be installed: for example, ~=1.4.2 permits any 1.4.x release from 1.4.2 up, but not 1.5.

Hermes
  • 1
    that's all good. So after a few months packages will have updates, how do you update those and again commit the .txt file? – vidstige Nov 23 '20 at 10:54
  • I updated my post so that it would better depict my approach. Assuming that the app is alive and actively developed, some changes are made to it from time to time. At some of these occasions its dependencies can be manually updated using the above approach. This may require some extra changes as there may be some incompatibilities. Other than that, the changes always go through CI/CD during which at least some hot-fixes can be applied thanks to `~=` in the `requirements.txt`. Since with `~=` no significant and breaking changes are to be expected, the builds can still be considered predictable. – Hermes May 31 '21 at 20:05
11

Another solution is to use the upgrade-requirements package

pip install upgrade-requirements

And then run :

upgrade-requirements

It will upgrade all the packages that are not at their latest versions, and also create an updated requirements.txt at the end.

dmdip
10

Fixing dependencies to a specific version is the recommended practice.

Here's another solution using pur to keep the dependencies fresh!

Give pur your requirements.txt file and it will auto-update all your high-level packages to the latest versions, keeping your original formatting and comments in place.

For example, running pur on the example requirements.txt updates the packages to the currently available latest versions:

$ pur -r requirements.txt
Updated flask: 0.9 -> 0.10.1
Updated sqlalchemy: 0.9.10 -> 1.0.12
Updated alembic: 0.8.4 -> 0.8.6
All requirements up-to-date.

As pur never modifies your environment or installed packages, it's extremely fast and you can safely run it without fear of corrupting your local virtual environment. Pur separates updating your requirements.txt file from installing the updates, so you can run pur first and then install the updates as a separate step.

Vishal Kharde
7

I've just had to do the same... used this small one-liner to do the job:

packages=$(cat requirements.txt | sed 's/==.*//g'); echo $packages | xargs pip3 install -U; freeze=$(pip3 freeze); for p in $(echo $packages); do echo "$freeze" | grep -E "^${p}==" >> requirements.new; done

which:

  • packages=$(cat requirements.txt | sed 's/==.*//g') creates a list of the current packages names in requirements.txt (removing the version).
  • echo $packages | xargs pip3 install -U then passes all of the packages as arguments to pip3 to upgrade.
  • freeze=$(pip3 freeze); Gets all of the current package versions in the format required for requirements.txt
  • for p in $(echo $packages) then iterates through the package names
    • echo "$freeze" | grep -E "^${p}==" >> requirements.new gets the line from the pip freeze output which matches the package and appends it to requirements.new (note that "$freeze" must be quoted so grep sees one package per line).

This has the added benefit of preserving the ordering of the original requirements.txt. :)
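
For anyone who prefers Python over shell, here is a rough sketch of the same flow (my own illustration, not part of the one-liner above); it assumes every line of requirements.txt is a plain name==version pin and that pip3 on your PATH belongs to the environment you want to upgrade:

import subprocess

def norm(name):
    # pip freeze may report "foo-bar" where requirements.txt says "foo_bar".
    return name.strip().lower().replace("_", "-")

# Collect the package names (version pins stripped) from requirements.txt.
with open("requirements.txt") as f:
    names = [line.split("==")[0].strip() for line in f if line.strip()]

# Upgrade everything in a single pip invocation.
subprocess.run(["pip3", "install", "-U", *names], check=True)

# Re-read the installed versions and rewrite the pins, preserving the original order.
freeze = subprocess.run(["pip3", "freeze"], capture_output=True, text=True, check=True)
pinned = {norm(line.split("==")[0]): line for line in freeze.stdout.splitlines() if "==" in line}

with open("requirements.new", "w") as out:
    for name in names:
        if norm(name) in pinned:
            out.write(pinned[norm(name)] + "\n")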

Hope this helps!

MatthewJohn
7

The second answer is the most useful, but what I wanted was to lock some packages while keeping others at their latest version (e.g. youtube-dl).

An example requirements.txt would look like this (~= is the compatible-release operator):

Pillow==6.2.2
requests~=2.22.0
youtube_dl

Then in the terminal, use the command pip install --upgrade -r requirements.txt

This ensures that Pillow will stay at 6.2.2, requests will be upgraded to the latest 2.22.x release (if available), and the latest version of youtube-dl will be installed if it isn't already.
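
If you want to double-check what a compatible-release pin accepts, here is a small sketch using the packaging library (an assumption on my part that it is available; install it with pip install packaging if needed):

from packaging.specifiers import SpecifierSet

spec = SpecifierSet("~=2.22.0")   # equivalent to ">=2.22.0,<2.23"
print("2.22.5" in spec)           # True: newer patch releases are allowed
print("2.23.0" in spec)           # False: the next minor release is excluded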

Elijah
4

Since I couldn't do that using bash, I wrote a small Python script to create a new requirements file with the versions stripped, and used that:

# Strip the pinned versions, writing just the package names to a new file.
with open('requirements-prod.pip', 'r') as data, \
        open('requirements-prod-no-version.pip', 'w') as data2:
    for line in data:
        # Keep only the package name (drop '==<version>' if present).
        data2.write(line.split('==')[0].rstrip() + '\n')

Then install the libs from the new file pip install -U -r requirements-prod-no-version.pip

Finally freeze the versions to the original file pip freeze > requirements-prod.pip

Montaro
4

A more robust solution, IMO, is to use a dependency manager such as Poetry (https://python-poetry.org), which comes with an exhaustive dependency resolver.

giotto
2

I guess the simplest solution is creating the requirements.txt file with:

pip freeze | sed 's/==/>=/' > requirements.txt

Then pip install --upgrade -r requirements.txt will pull in the newest available versions.
JUNPA
2

You can use the command below on Linux and macOS (it assumes every line in requirements.txt is a simple name==version pin):

cat requirements.txt | cut -f1 -d= | xargs pip install -U
darw
1

No third-party dependencies; it's asynchronous and fast.

import asyncio, argparse, json, re, io
from collections import namedtuple
from typing import Iterable
import urllib.request

Package = namedtuple("Package", ["name", "version", "raw"])
placeholder = "{name:<30} {old_ver:10} {new_ver:<10}"


def _fetch_latest(package: Package) -> tuple[Package, Package]:
    # Blocking call to the PyPI JSON API.
    base_package, *_ = package.name.split("[")
    with urllib.request.urlopen(f"https://pypi.org/pypi/{base_package}/json") as f:
        version = json.loads(f.read())["info"]["version"]
    return package, Package(package.name, version, f"{package.name}=={version}")


async def get_pypi_latest_version(package: Package) -> tuple[Package, Package]:
    # Run the blocking fetch in a worker thread so asyncio.gather overlaps requests.
    return await asyncio.to_thread(_fetch_latest, package)


async def fetch_all_latest_packages(
    packages: Iterable[Package],
) -> list[tuple[Package, Package]]:
    coro = (get_pypi_latest_version(package) for package in packages)
    new_packages = await asyncio.gather(*coro)
    return [(old, new) for old, new in new_packages if old.version != new.version]


def read_packages(requirements_io: io.TextIOWrapper):
    for line in requirements_io:
        raw = line.strip()
        package_name, *_, version = re.split(r"<|=|>|\[\]", raw)
        yield Package(package_name, version, raw)
    requirements_io.close()


if __name__ == "__main__":
    parser = argparse.ArgumentParser(
        description="Check for updated dependencies in requirements file."
    )
    parser.add_argument(
        "filename",
        type=argparse.FileType("r"),
        help="requirement file",
    )
    parser.add_argument(
        "-u",
        "--update",
        action="store_true",
        help="also update requirement file with new versions.",
    )
    args = parser.parse_args()
    current_packages = list(read_packages(args.filename))
    latest_packages = asyncio.run(fetch_all_latest_packages(current_packages))
    if not latest_packages:
        print("Everything upto date.")
        exit(0)
    print(placeholder.format(name="NAME", old_ver="OLD_VER", new_ver="NEW_VER"))
    for old, new in latest_packages:
        if args.update:
            current_packages[current_packages.index(old)] = new
        print(
            placeholder.format(name=new.name, old_ver=old.version, new_ver=new.version)
        )
    if args.update:
        with open(args.filename.name, mode="w", encoding="utf_8", newline="\n") as f:
            f.writelines([raw + "\n" for *_, raw in current_packages])

$ python check_updates.py requirements.txt
NAME            OLD_VER         NEW_VER
Django          4.2.1           4.2.2
django-filter   23.1            23.2
psycopg         3.1.1           3.1.9


$ python check_updates.py -h
usage: check_updates.py [-h] [-u] filename

Check for updated dependencies in requirements file.

positional arguments:
  filename      requirement file

options:
  -h, --help    show this help message and exit
  -u, --update  also update requirement file with new versions.
Gaurov Soni
0
  • 1) To upgrade pip-installed packages from requirements.txt, replace == with >=. This tells pip to install a version greater than or equal to the one requested, thereby installing the most up-to-date release of each requested library.

    1.a) By adding py -m pip install -r requirements.txt to a daily restart (or something of that nature) you can keep your installed libs updated. Andy summed this up perfectly.

    My reason for entering this thread was to find out how to update the base pip inside a virtual environment (usually 10.0.03 for me), hoping to solve it through one of two approaches: A) creation of the venv, or B) installation of the required libs.

Thanks to Andy, need B is satisfied by adding a pip >= <requested version> line to requirements.txt. Then, upon creating a new virtual environment (or re-creating a previous one):

  1. py -m venv devenv to set up a new dev env
  2. devenv\scripts\activate.bat to activate the dev env
  3. python -m pip install -r requirements.txt to install the base libs

which yields output like:

Collecting pip >= 20.0.2 (from -r requirements.txt (line 1))
  Using cached https://files.pythonhosted.org/packages/54/0c/d01aa759fdc501a58f431eb594a17495f15b88da142ce14b5845662c13f3/pip-20.0.2-py2.py3-none-any.whl
Found existing installation: pip 10.0.1
Uninstalling pip-10.0.1:
  Successfully uninstalled pip-10.0.1
Successfully installed pip-20.0.2

Sorry for the brain dump, hope this helps someone :)

Austin

Chameera Dulanga
-1

If you install anything in your Django project and afterwards want to update your requirements file, this command will update requirements.txt: pip freeze > requirements.txt

If a requirements file does not exist in your project yet, the same command will create a new requirements.txt file: pip freeze > requirements.txt

  • This is a bad idea because it will also add the requirements of all your requirements, resulting in an unnecessarily large requirements file. If you then decide to update a single dependency, you are likely to get version conflicts, unless you know which other requirements were added to your requirements file because of that package. – rioted Mar 13 '20 at 13:57
-1

With pip-tools you have a basic requirements.in listing your desired dependencies and a requirements.txt file with the pinned versions. pip-tools then generates the pinned versions automatically, which makes handling the whole process, including upgrading your dependencies, a lot easier.

# requirements.in
django

and the autogenerated requirements.txt (to pin all dependencies)

$ pip-compile requirements.in
#
# This file is autogenerated by pip-compile
# To update, run:
#
#    pip-compile requirements.in
#
asgiref==3.2.3
    # via django
django==3.0.3
    # via -r requirements.in
pytz==2019.3
    # via django
sqlparse==0.3.0
    # via django

If you use that workflow, which I can highly recommend, upgrading everything is just

pip-compile --upgrade

which regenerates requirements.txt with the latest versions allowed by requirements.in; pip-sync (also part of pip-tools) then installs exactly those pins into your environment.

Karl Lorey
-19

I edit requirements.txt as below and run sh ./requirements.txt:

pip install -U amqp;
pip install -U appdirs;
pip install -U arrow;
pip install -U Babel;
pip install -U billiard;
pip install -U celery;
pip install -U Django;
pip install -U django-cors-headers;
pip install -U django-crispy-forms;
pip install -U django-filter;
pip install -U django-markdown-deux;
pip install -U django-pagedown;
pip install -U django-timezone-field;
pip install -U djangorestframework;
pip install -U fcm-django;
pip install -U flower;
pip install -U gunicorn;
pip install -U kombu;
pip install -U Markdown;
pip install -U markdown2;
pip install -U packaging;
Santhosh
  • 11
    This is an ideal example of an anti-pattern. What's wrong here: 1) requirements.txt is a .txt file, but you've made it executable. 2) There is a simple pip install -r requirements.txt command, so you can use the requirements.txt file just for listing your project dependencies. 3) You're not pinning versions of the packages. 4) It's not a common pattern, so other developers won't know how to use it. 5) It's hard to use with CI/CD pipelines. Please don't use this example – fanni Dec 14 '19 at 14:21