
I have a model u(x,t) with layers 2×50, then 50×50, and 50×1.

I train the model with an input (x, t) of size [100, 2]. The final layer outputs u, and I now want to differentiate it with respect to x and t, and take the second derivative with respect to x. How do I do this in PyTorch?
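For reference, here is a minimal sketch of such a model (only the layer sizes come from the question; the `Tanh` activations and the name `model` are illustrative assumptions):

import torch.nn as nn

# 2 -> 50 -> 50 -> 1 fully-connected network; tanh is a common activation choice
model = nn.Sequential(
    nn.Linear(2, 50),
    nn.Tanh(),
    nn.Linear(50, 50),
    nn.Tanh(),
    nn.Linear(50, 1),
)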


1 Answer


You can use PyTorch's autograd engine like so:

import torch

# x and t each hold 100 samples; stacked column-wise they form the [100, 2] input
x = torch.randn(100, requires_grad=True)
t = torch.randn(100, requires_grad=True)
u = model(torch.stack([x, t], dim=1))  # `model` is your network; u has shape [100, 1]

# u is not a scalar, so grad needs an explicit grad_outputs vector
# (alternatively, reduce u to a scalar first with .sum())
ones = torch.ones_like(u)

# 1st derivatives (higher orders require `create_graph=True`)
du_dt = torch.autograd.grad(u, t, grad_outputs=ones, create_graph=True)[0]
du_dx = torch.autograd.grad(u, x, grad_outputs=ones, create_graph=True)[0]

# 2nd derivative w.r.t. x
d2u_dx2 = torch.autograd.grad(du_dx, x, grad_outputs=torch.ones_like(du_dx))[0]
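
Note that `create_graph=True` makes the first derivative itself part of the autograd graph, which is what allows `du_dx` to be differentiated a second time; without it, the final `grad` call would raise an error.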
  • Thanks, but I am still getting an error. File "C:\Users\Sunny Raghav\anaconda3\lib\site-packages\torch\autograd\__init__.py", line 34, in _make_grads raise RuntimeError("grad can be implicitly created only for scalar outputs") RuntimeError: grad can be implicitly created only for scalar outputs – Sunny Mar 19 '21 at 20:49
  • @Sunny Your function `u()` produces multiple outputs - PyTorch's gradients are designed to work on scalar functions. You'll need to coalesce them into a single value using [e.g. `.sum()`](https://stackoverflow.com/questions/58510249/pytorch-autograd-what-does-runtime-error-grad-can-be-implicitly-created-only-f). – iacob Mar 19 '21 at 20:52
  • I am trying to solve a PDE, for which I need to calculate the derivatives of u(x,t) with respect to t and x. – Sunny Mar 19 '21 at 20:59
  • @Sunny I understand that - you get the error "RuntimeError: grad can be implicitly created only for scalar outputs" when you call `grad` on a function outputting a tensor with multiple values (e.g. `[0, 1, 0, 1]`); without an explicit `grad_outputs`, it only supports differentiation of functions which output a single value (e.g. `1.34`). – iacob Mar 20 '21 at 08:51
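
For anyone hitting the same RuntimeError, here is a minimal, self-contained sketch of the two workarounds discussed in the comments above (the quadratic `u = x**2` stand-in for the model output is purely illustrative):

import torch

# Toy stand-in for a model output that depends on x, with shape [100, 1]
x = torch.randn(100, requires_grad=True)
u = (x ** 2).unsqueeze(1)

# Workaround 1: reduce the output to a scalar before differentiating
du_dx_sum = torch.autograd.grad(u.sum(), x, retain_graph=True)[0]

# Workaround 2: keep the vector output and pass grad_outputs explicitly
du_dx_vec = torch.autograd.grad(u, x, grad_outputs=torch.ones_like(u))[0]

# Both give du/dx = 2x here
assert torch.allclose(du_dx_sum, du_dx_vec)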