What is the PyTorch equivalent of tf.stop_gradient() (which provides a way to exclude some variables from gradient computation during back-propagation)?

aerin
    Do any of these answer your question? https://datascience.stackexchange.com/questions/32651/what-is-the-use-of-torch-no-grad-in-pytorch https://stackoverflow.com/questions/56816241/difference-between-detach-and-with-torch-nograd-in-pytorch/56817594 – Stef Sep 16 '20 at 14:30

2 Answers

Use x.detach(). It returns a tensor that shares the same data as x but is detached from the computation graph, so no gradient flows back through it.
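A minimal sketch of how detach() behaves like tf.stop_gradient() (the tensor values here are just for illustration):

```python
import torch

x = torch.tensor([2.0, 3.0], requires_grad=True)
y = x * 2
z = y.detach()      # z has the same values as y, but is cut off from the graph

w = (z * x).sum()   # z is treated as a constant during back-propagation
w.backward()

print(x.grad)       # tensor([4., 6.]) — equals z; no gradient flowed through y
```

Without the detach(), w would be (2*x*x).sum() and x.grad would be 4*x instead, because the gradient would also flow back through y.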

Deepali

Tensors in PyTorch have a requires_grad attribute. Set it to False to prevent gradient computation for those tensors.
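A short sketch of this approach, freezing one leaf tensor while leaving the other trainable (the tensor shapes are arbitrary):

```python
import torch

a = torch.ones(3, requires_grad=True)
b = torch.ones(3, requires_grad=True)

# Freeze b: the in-place setter excludes it from gradient computation
b.requires_grad_(False)

loss = (a * b).sum()
loss.backward()

print(a.grad)   # tensor([1., 1., 1.]) — gradient computed as usual
print(b.grad)   # None — b was excluded from back-propagation
```

This is the common way to freeze parameters of a model, e.g. looping over `layer.parameters()` and setting requires_grad to False before fine-tuning.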

Shai