
How can I update PyTorch from 1.4 to 1.5 using Anaconda, either through the terminal or the Navigator?

Updating Anaconda with conda update --all updated some of the packages, but not all of them, PyTorch included.

Initially, I installed PyTorch by running conda install -c pytorch pytorch
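
Just to show what I'm starting from, a quick check like this should reveal the installed version and the channel it came from (standard conda/Python commands, nothing specific to my setup):

conda list pytorch                                   # shows version, build, and channel of the installed package
python -c "import torch; print(torch.__version__)"  # prints e.g. 1.4.0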

The PyTorch GitHub page gives the command

conda install -c pytorch magma-cuda90 # or [magma-cuda92 | magma-cuda100 | magma-cuda101 ] depending on your cuda version

but I wonder if there's going to be any kind of conflict between the already installed version and this one.
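
If it helps, one way I could preview the change without touching anything (my assumption; --dry-run is a standard conda flag) would be:

conda list "magma"                                 # is any magma package already installed?
conda install -c pytorch magma-cuda101 --dry-run   # show the planned transaction without installing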

Thanks


1 Answer


PyTorch's latest stable release (as of April 06, 2020) is still 1.4, as you can see here.

Therefore, if you want to install the nightly build (which is on track to become 1.5) using conda, you can follow the official instructions:

  • CUDA 10.1:
conda install pytorch torchvision cudatoolkit=10.1 -c pytorch-nightly -c defaults -c conda-forge
  • CUDA 9.2:
conda install pytorch torchvision cudatoolkit=9.2 -c pytorch-nightly -c defaults -c conda-forge -c numba/label/dev

or the CPU-only version:

conda install pytorch torchvision cpuonly -c pytorch-nightly -c defaults -c conda-forge
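
If you'd rather keep your current 1.4 installation untouched, one option (my suggestion, not part of the official instructions) is to put the nightly build in a separate environment and then check what you got:

conda create -n torch-nightly python=3.7    # environment name is just an example
conda activate torch-nightly
conda install pytorch torchvision cudatoolkit=10.1 -c pytorch-nightly -c defaults -c conda-forge
python -c "import torch; print(torch.__version__)"   # nightly builds report something like 1.5.0.devYYYYMMDD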

Or you can just wait for 1.5 to become a stable release (currently, we are at release candidate 2) and update the pytorch package as you would have done otherwise.
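
Once 1.5 reaches the stable pytorch channel, an update along these lines should be enough (a sketch; the explicit version pin is optional):

conda update -c pytorch pytorch torchvision
# or, pinning the version:
conda install -c pytorch pytorch=1.5 torchvision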


Be aware that:

PyTorch 1.4 is the last release that supports Python 2

So, if you're moving to PyTorch 1.5, say goodbye to Python 2 (Yay!!).

  • Thanks!!!! The problem with PyTorch 1.4 is that setting the kernel size or dilation is not robust, whereas in PyTorch 1.5 it is, i.e. padding is automatically applied to the input image if the final (after-dilation) kernel size is greater than the input image. This is concluded by checking the related code in both releases. Staying on 1.4 requires updating the problematic code. What's your take on that? – vpap Apr 07 '20 at 01:01
  • I've used the nightly build in many projects when I needed it. I didn't check these pull requests specifically, but if they help you, go for it :) Don't forget to upvote if this answer has helped you. And goodbye Python 2 :) – Berriel Apr 07 '20 at 01:17
  • @vpap Just to add some context, the next releases (at least before `2.x`, I suppose) should provide backwards compatibility (only the next `major` would break previous behaviour; there are some exceptions though, see [optimizer and scheduler answer](https://stackoverflow.com/questions/59017023/pytorch-learning-rate-scheduler/59017297#59017297_)), so the changes you are getting should be improving your user experience. If that's not the case and a serious regression occurred, you should open an issue on the official PyTorch repo. – Szymon Maszke Apr 08 '20 at 00:50