
In my application I would like to use:

  • packageA, which requires packageX==1.3
  • packageB, which requires packageX==1.4
  • packageX==1.5

How can I install multiple versions of packageX with pip to handle this situation?

  • install packages to non-standard locations, then use PYTHONPATH to determine which one to import; see [this answer](https://stackoverflow.com/a/19404371/4115369) – Yibo Yang Jun 25 '17 at 15:44
  • On a side note, the scenario in this question, and the hacky solution in the [currently accepted answer](https://stackoverflow.com/a/6572017), are a good example of why a package owner should not and must not pin an exact version of their dependency. They should have been using `packageX>=1.3,<2` and `packageX>=1.4,<2` (assuming the upstream packageX is following [Semantic Versioning](http://semver.org)); then you as the downstream app developer would have no problem pinning and using `packageX==1.5`, and everyone would be happy. [More details here](https://stackoverflow.com/a/53718957) – RayLuo Jul 23 '19 at 16:37
  • Hmm. Maybe the new brackets for optional packages `main_package[opt]` could help – Kermit Jan 30 '21 at 16:07
  • @rayluo, true. However, then you need to QA/test against all those versions...or just pray. – Robert Lugg Aug 03 '21 at 19:53
  • @RobertLugg, there is a difference between a mid-tier library and a final application. As a library, `packageA` or `packageB` should not pin a specific version of its dependency `packageX`, regardless of whether it has tested all the `packageX` versions (and it certainly cannot test future versions such as `packageX==1.6`, `1.7`, etc.). It is the job of application `APP`'s developer to pin a particular combination of `A`, `B` and `X`, after testing. If `packageA` and `B` pin `X` the way they do in this question, `APP`'s developer does not even have a chance to test. Prayer won't help them. – RayLuo Aug 03 '21 at 21:18
  • @rayluo, when you, the owner of a package, declare the dependency (packageX) and you declare versions of that package (let's say 1.6-1.8), you are making a promise that regardless of which version of packageX the user picks, your package will work. The only way for you to know it works is to test against each. I'm not saying that package owners do this, but if they want to be sure their package runs, they have to. It's painful, for sure. That's why conservative package owners would only specify the range of tested dependency versions. Neither way is without its problems. – Robert Lugg Aug 04 '21 at 19:38
  • @rayluo, I have to concede. A package owner could lock the dependencies very tightly and be absolutely sure. But that would require them to re-test and re-publish any time any of their dependencies updates. I agree that would be crazy. Thanks for the link and discussion. – Robert Lugg Aug 05 '21 at 06:26

5 Answers


pip won't help you with this.

You can tell it to install a specific version, but it will override the other one. On the other hand, using two virtualenvs will let you install both versions on the same machine, but not use them at the same time.

Your best bet is to install both versions manually, by putting them on your Python path under different names.

But if your two libs expect them to have the same name (and they should), you will have to modify them so they pick up the version they need with an import alias such as:

import dependencyname_version as dependencyname

There is currently no clean way to do this. The best you can hope for is that this hack works.
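
A minimal sketch of the renamed-copies idea, using two tiny stand-in packages built on the fly (all names here are hypothetical; in a real setup you would copy the installed package directories by hand):

```python
import os
import sys
import tempfile

# Simulate two renamed copies of the same dependency living side by side.
# 'depname_13' and 'depname_14' stand in for hand-renamed installs of a
# library pinned at versions 1.3 and 1.4.
root = tempfile.mkdtemp()
for name, version in [("depname_13", "1.3"), ("depname_14", "1.4")]:
    pkg_dir = os.path.join(root, name)
    os.makedirs(pkg_dir)
    with open(os.path.join(pkg_dir, "__init__.py"), "w") as f:
        f.write(f"__version__ = {version!r}\n")

sys.path.insert(0, root)

# Each consumer aliases the copy it needs back to the name it expects.
import depname_13 as depname
import depname_14 as depname_other
```

The aliasing on the last two lines is the part each lib would have to be patched to do for itself.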

I'd rather ditch one of the two libs and replace it with an equivalent, or patch it to accept the new version of the dependency and contribute the patch back to the community.

  • So many languages have this problem... Java, Go, Haxe... hopefully language designers will abandon global package namespaces and adopt a more Node.js-like module system in future languages they create, since Node.js supports multiple versions of dependencies. – Andy May 30 '20 at 15:41
  • With Maven or Gradle you can install two versions of the same package; with pip you can't. Still, you can't use two versions of the same package in the same program. – stripthesoul Jan 29 '21 at 13:53
  • @David how stupid is this that the package manager uses versioning, but completely ignores it when you want to install multiple of them LOL – t3chb0t Aug 28 '22 at 08:22
  • Ironic so many languages preach no globals, premature optimization is the root of all evil, more namespaces, etc. then have global named dependencies per binary. Why shouldn’t each piece be allowed to use its own version of things? developer ease >> deployment size, especially for python. – SwimBikeRun Dec 25 '22 at 15:52
  • @Andy Java is not having this problem. At least for the last 20+ years. Learn about Apache Ivy, Apache Maven, Gradle.. – ᄂ ᄀ May 21 '23 at 13:38
  • @stripthesoul is Maven able to set up some kind of classloader tricks so that the FQNs of classes in two versions of the same package don't conflict? – Andy May 22 '23 at 18:40

Download the source for each package, and install each into its own separate folder. For example: I had version 1.10 of a package installed, but wanted to switch to the dev version for some work. I downloaded the source for the dev module:

git clone https://github.com/networkx/networkx.git
cd networkx

I created a folder for this version: mkdir /home/username/opt/python, then I set the PYTHONPATH env var to: export PYTHONPATH=/home/username/opt/python/lib/python2.7/site-packages/. Next, I installed it using: python setup.py install --prefix=/home/username/opt/python

Now, since my PYTHONPATH points to this other site-packages folder, when I run python on the command line and import the new module, it works. To switch back, remove the new folder from PYTHONPATH.

>>> import networkx as nx
>>> nx.__version__
'2.0.dev_20151209221101'
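
When juggling PYTHONPATH like this, it helps to confirm which copy Python will actually load before importing anything. A small sketch using the standard library (the stdlib json module stands in for the package you are switching between):

```python
import importlib.util

# Ask the import machinery where a module would be loaded from, without
# importing it. With a PYTHONPATH override in place, the reported path
# should point into your custom site-packages folder.
spec = importlib.util.find_spec("json")  # stand-in module name
print(spec.origin)
```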
  • Hah, came here specifically because I'm fighting `networkx` version differences! – dwanderson Apr 11 '19 at 23:24
  • "To switch back, remove the new folder from PYTHONPATH" — this part is not working for me, even after removing the new folder's path from `sys.path` – Avinash Raj Jul 04 '19 at 06:50

An ugly workaround I use with Python in Blender: I install (and keep off PATH) a like version of Python, and use subprocess to have the other interpreter do the needed work. Blender's Python tends to get a little temperamental if you do much more than install pandas and scipy. I've tried using virtualenvs with Blender, but that tends to break things.
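
A sketch of that subprocess handoff, assuming a second interpreter at a hypothetical path and exchanging data as JSON over stdin/stdout:

```python
import json
import subprocess

# Hypothetical path to the second interpreter, kept off PATH.
OTHER_PYTHON = "/opt/python-alt/bin/python3"

def run_in_other_python(numbers, interpreter=OTHER_PYTHON):
    # The other interpreter (with its own site-packages) does the work;
    # doubling the numbers here is a stand-in for the real job.
    worker = (
        "import sys, json; "
        "data = json.load(sys.stdin); "
        "print(json.dumps([x * 2 for x in data]))"
    )
    result = subprocess.run(
        [interpreter, "-c", worker],
        input=json.dumps(numbers),
        capture_output=True, text=True, check=True,
    )
    return json.loads(result.stdout)
```

Because the two interpreters only share serialized data, their site-packages never have to agree on anything.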

Also, on the off chance you are using Blender for data visualization, you will want to add a config folder to your version-number folder. This keeps all of your addons in that folder, makes the install far more portable, and makes it far less likely to mess up other installs of Blender. Many people who make addons for Blender are not 'programmers', so those otherwise savvy people often do some very hackish things, and this has been the best workaround I've been able to use.

Another workaround (and this has so many flags on the play that it should disqualify me from touching a keyboard) is to manually locate the module's init file and manually add it to globals with importlib. This comes with risks: some modules will play alright when you do this, while others will throw fits that can lead to extra-special troubleshooting sessions. Keeping it to like versions cuts down on the issues, and I've had 'alright' luck using this to import modules from behind virtualenvs, but there is a reason why I use subprocess calls when working with Blender's Python.

import importlib.util

def import_from_file_location(alias, module_name, path):
    # e.g. alias = 'tk', module_name = "tkinter", path = r'C:\pyth
    spec = importlib.util.spec_from_file_location(module_name, path)
    print(spec)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    globals()[alias] = module

Every time I need to update an old Python library, I do the following:

  1. Clone the library I need to update (usually from GitHub)
  2. Run git checkout to get the version I am currently using
  3. Package the old version under a different name in some package repository (usually Gemfury for me)
  4. Install the old version under the new name I gave it
  5. Update the package to the new version I would like to use
  6. Create a fallback: if the new version fails, fall back to the old version

The code will look something like this:

import logging
from typing import Any

import ujson      # ujson version 5.7.0
import ujson_old  # ujson version 1.34, repackaged under the new name

def ujson_dumps(some_obj: Any) -> str:
    try:
        return ujson.dumps(some_obj)
    except ValueError:
        logging.error('Error serializing with new ujson version')
    return ujson_old.dumps(some_obj)

I recommend reading this blog post about how ujson was upgraded from a very old version to the latest.


Another "workaround" is to use IPC/RPC: run the packages that need conflicting versions in isolated services. If the conflicting dependencies belong to different parts of the application, you may be able to split it into services along those usage boundaries.
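
As a minimal sketch of the RPC idea, the conflicting package can be wrapped in a tiny service. Here the "service" runs in a thread of the same process only to keep the example self-contained, and `transform` is a stand-in for a real call into the isolated, pinned dependency:

```python
import threading
from xmlrpc.client import ServerProxy
from xmlrpc.server import SimpleXMLRPCServer

def transform(text):
    # Stand-in for a call into the isolated, pinned package version.
    return text.upper()

# Port 0 lets the OS pick a free port.
server = SimpleXMLRPCServer(("localhost", 0), logRequests=False)
server.register_function(transform)
port = server.server_address[1]

# Serve exactly one request in the background, then query it as a client.
threading.Thread(target=server.handle_request, daemon=True).start()
proxy = ServerProxy(f"http://localhost:{port}")
result = proxy.transform("hello")
server.server_close()
print(result)
```

In a real deployment the server would run in its own virtualenv with its own pinned version of the dependency, so the two versions never share an interpreter.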
