
After a recent upgrade, when running my PyTorch training loop, I now get the warning:

Using a non-full backward hook when the forward contains multiple autograd Nodes.

The training still runs and completes, but I am unsure where I am supposed to place the register_full_backward_hook function.

I have tried adding it to each of the layers in my neural network, but this gives further errors about using different hooks.

Can anyone please advise?

1 Answer

PyTorch version 1.8.0 deprecated register_backward_hook (source code) in favor of register_full_backward_hook (source code).

You can find it in the patch notes here: Deprecated old style nn.Module backward hooks (PR #46163)

The warning you're getting:

Using a non-full backward hook when the forward contains multiple autograd Nodes is deprecated and will be removed in future versions. This hook will be missing some grad_input. Please use register_full_backward_hook to get the documented behavior.

It simply indicates that you should replace all register_backward_hook calls with register_full_backward_hook in your code to get the behavior described in the documentation.
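For illustration, here is a minimal sketch of the change, using a toy model and hook (both hypothetical, not taken from your code):

```python
import torch
import torch.nn as nn

# Hypothetical model, just to show where the hook is registered.
model = nn.Sequential(nn.Linear(10, 16), nn.ReLU(), nn.Linear(16, 2))

def hook(module, grad_input, grad_output):
    # Both arguments are tuples of gradients (entries can be None).
    shapes = [g.shape for g in grad_output if g is not None]
    print(module.__class__.__name__, shapes)

# Old, deprecated style (triggers the warning):
# model.register_backward_hook(hook)

# New style, same call site:
model.register_full_backward_hook(hook)

out = model(torch.randn(4, 10))
out.sum().backward()  # the hook fires during this backward pass
```

Note that the two styles should not be mixed: if both a regular and a full backward hook end up registered on the same module, PyTorch raises an error, which may be the "different hooks" issue you ran into when adding hooks layer by layer.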
