After a recent upgrade, running my PyTorch training loop now emits the warning "Using a non-full backward hook when the forward contains multiple autograd Nodes". Training still runs and completes, but I am unsure where I am supposed to call `register_full_backward_hook`.
I have tried registering it on each layer of my neural network, but that raises a further error about mixing different kinds of hooks on the same module.
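Here is a minimal sketch of roughly what I am doing (the model, layer sizes, and hook function are placeholders, not my real code), which reproduces the warning for me:

```python
import torch
import torch.nn as nn

# Placeholder model; my real network is larger.
model = nn.Sequential(
    nn.Linear(10, 20),
    nn.ReLU(),
    nn.Linear(20, 1),
)

def inspect_grads(module, grad_input, grad_output):
    # The same signature works for both hook variants.
    print(module.__class__.__name__, grad_output[0].shape)

# Old-style hook on a module whose forward involves several
# autograd Nodes -- this is what triggers the deprecation warning.
model.register_backward_hook(inspect_grads)

# Adding the full hook on the same module as well is what gives me
# the further error about using different hooks:
# model.register_full_backward_hook(inspect_grads)

out = model(torch.randn(4, 10))
out.sum().backward()
```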
Can anyone please advise?