
I am working on a python library and we are currently using pip-tools to pin dependencies.

E.g. a basic requirements.in:

black
pandas
torch

Running pip-compile on this file will generate a requirements.txt like the following (simplified here):

black==20.8b1             # via -r requirements.in
click==7.1.2              # via black
numpy==1.19.4             # via pandas, torch
pandas==1.1.5             # via -r requirements.in
pathspec==0.8.1           # via black
pytz==2020.4              # via pandas
regex==2020.11.13         # via black
torch==1.7.1              # via -r requirements.in

My problem is that pinning exact versions is not recommended for a package that will be published on PyPI, because it takes away flexibility from end users who install it alongside other libraries. It appears to be common practice to specify dependency ranges instead, e.g.

numpy >=1.18.0, <1.19.0
pandas >=1.0.0
etc.
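Range specifiers like these are standard PEP 440 syntax, and they can be checked programmatically with the `packaging` library (the same version machinery pip itself relies on); a minimal sketch:

```python
# PEP 440 range specifiers can be evaluated with the `packaging` library,
# the same version machinery pip itself relies on.
from packaging.specifiers import SpecifierSet
from packaging.version import Version

spec = SpecifierSet(">=1.18.0,<1.19.0")

print(Version("1.18.5") in spec)  # True  -- inside the range
print(Version("1.19.4") in spec)  # False -- excluded by the upper bound
```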

My question is: is there any way to auto-generate these dependency ranges given a requirements.in like the one above? I imagine this should be relatively easy using existing dependency resolution tools to compute the minimum and maximum versions that are compatible with all libraries. (Any libraries which don't have any versions specified can be left as is.)

I haven't been able to find any tool that does anything like this. Are maintainers of other libraries performing this task manually? I understand the approach described above still requires some manual intervention for libraries in requirements.in that have no cross-dependencies, but for the most part (certainly in my case) it should do the heavy lifting.
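For what it's worth, one crude starting point (a heuristic sketch, not an existing tool) is to mechanically widen each pin that pip-compile produces into a caret-style range: the pinned version as the lower bound, the next major version as the upper bound. This is only sound for dependencies that roughly follow semantic versioning, which, as discussed, many do not:

```python
# Heuristic sketch (not an existing tool): widen each `pkg==X.Y.Z` pin
# from a compiled requirements.txt into `pkg>=X.Y.Z,<X+1`.
# Only reasonable for dependencies that follow semantic versioning.
import re

PIN = re.compile(r"^(?P<name>[A-Za-z0-9._-]+)==(?P<version>[0-9][^\s#]*)")

def widen(line):
    """Turn 'pandas==1.1.5' into 'pandas>=1.1.5,<2'; return None for non-pins."""
    m = PIN.match(line.strip())
    if m is None:
        return None  # comments, editable installs, etc. are left alone
    name, version = m.group("name"), m.group("version")
    next_major = int(version.split(".")[0]) + 1
    return f"{name}>={version},<{next_major}"

print(widen("pandas==1.1.5             # via -r requirements.in"))
# -> pandas>=1.1.5,<2
```

This only produces a guess at an upper bound; it says nothing about how low the lower bound could safely go, which is the harder half of the problem.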

amin_nejad
    Yes, you should write the custom version ranges for your top-level dependencies manually in your `requirements.in`. It is near impossible to automate this task. It would be feasible if every project used reliable semantic versioning, for example, but that is not the case, so this has to be done manually. – sinoroc Dec 15 '20 at 07:32
  • I guess I'm a little surprised but thanks for confirming that @sinoroc – amin_nejad Dec 15 '20 at 12:08
  • I am not sure what you expect to be done automatically. If you don't give any restrictions, then installers (pip, poetry, dephell, etc.) will figure out 1 compatible combination automatically, but they don't compute all compatible combinations (or ranges). It is your job as the project's authors to declare the version ranges for the top level libraries. The same way you carefully selected the direct dependencies, you should also carefully select the version ranges of those direct dependencies. I do not know of any tool that could do that automatically (not just in Python). – sinoroc Dec 15 '20 at 12:44
  • For instance, `pip-tools` doesn't just compute 1 compatible combination, it can actually compute the _latest_ compatible combination for all libraries where possible each time `pip-compile --upgrade` is run. Since it's already doing this, I'm thinking it should also be able to compute the oldest possible versions too (again only where this is specified of course in another library). – amin_nejad Dec 15 '20 at 16:52
  • Maybe it is possible to get a lower bound; I don't know, I should think about it. -- Honestly, I am not so sure there is much value in doing this, though. Dependency resolution is quite an expensive operation. – sinoroc Dec 15 '20 at 20:58
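The lower-bound check discussed in the last few comments can be sketched as a small shell workflow (hypothetical, not a pip-tools feature — pip-compile has no built-in "oldest versions" mode). The idea: pin every range in requirements.in to its declared minimum, then install with those constraints and run the test suite to see whether the lower bounds actually work:

```shell
# Pin every `>=X` range to `==X` to produce a "minimum versions" constraints
# file, then verify it with the project's own tests, e.g.:
#
#   pip install -c constraints-min.txt -r requirements.in && pytest
#
# A sample requirements.in stands in for the real one here:
printf 'numpy >=1.18.0, <1.19.0\npandas >=1.0.0\nblack\n' > requirements.in
sed -E 's/>=([^,]+).*/==\1/' requirements.in > constraints-min.txt
cat constraints-min.txt
```

Unconstrained entries (like `black` above) pass through unchanged, so only the ranges you declared get probed.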

0 Answers