
I am trying to understand PyTorch autograd in depth. I would like to observe the gradient of a simple tensor after it goes through a sigmoid function, as below:

import torch
from torch import autograd

D = torch.arange(-8, 8, 0.1, requires_grad=True)

with autograd.set_grad_enabled(True):
    S = D.sigmoid()
S.backward()  # this call raises the RuntimeError below

My goal is to read D.grad, but even before I can access it, the call to S.backward() raises the runtime error:

RuntimeError: grad can be implicitly created only for scalar outputs

I saw another post with a similar question, but the answer there does not apply to my case. Thanks

A.E

1 Answer


The error means you can only call .backward (with no arguments) on a scalar tensor, i.e. a tensor with a single element.

For example, you could do

T = torch.sum(S)
T.backward()

since T would be a scalar output.
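
To make this concrete, here is a minimal runnable sketch (reusing the setup from the question) that sums S into a scalar, calls backward, and checks D.grad against the analytic sigmoid derivative S * (1 - S):

import torch

D = torch.arange(-8, 8, 0.1, requires_grad=True)
S = D.sigmoid()

# Reduce to a scalar so backward() can be called without arguments
T = torch.sum(S)
T.backward()

# D.grad now holds dT/dD; elementwise this equals sigmoid'(x) = S * (1 - S)
expected = (S * (1 - S)).detach()
print(torch.allclose(D.grad, expected))  # True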

I posted some more information on using PyTorch to compute derivatives of tensors in this answer.
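
As an alternative, and this is a general PyTorch pattern rather than something specific to this answer, you can keep S non-scalar and pass a gradient tensor to backward. backward then computes the vector-Jacobian product, and a tensor of ones reproduces the same elementwise gradients as summing first:

import torch

D = torch.arange(-8, 8, 0.1, requires_grad=True)
S = D.sigmoid()

# Passing a tensor of ones computes the vector-Jacobian product,
# which here gives the same per-element gradients as T = torch.sum(S)
S.backward(torch.ones_like(S))
print(D.grad[:3])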

jodag