
First of all, I am very new to Python and machine learning, so please excuse my ignorance about what might be a very basic issue; I appreciate any input on this question!

I have a very complicated scalar-valued multivariable function implemented in Python using PyTorch operations (it is actually a composition of a neural network and operations that depend on the network's outputs), and I wish to find the gradient vector and Hessian matrix of this function at certain points. Besides numdifftools, which uses finite differences and is neither fast nor accurate when the input dimension is high, are there alternatives? One thing that seems promising is torch.autograd, which I believe is used to compute the gradients of neural networks; however, can it compute the gradient and Hessian of an arbitrary black-box function that runs PyTorch code? Any input is appreciated!
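To make the question concrete, here is a minimal sketch of what I mean, using a toy scalar-valued function in place of my actual network-plus-postprocessing composition (the function `f` below is purely illustrative). It uses `torch.autograd.grad` for the gradient and `torch.autograd.functional.hessian` for the Hessian; I am unsure whether this is the recommended approach:

```python
import torch

# Toy stand-in for the complicated scalar-valued function described above
# (hypothetical; the real function composes a neural network with further ops).
def f(x):
    return (x ** 2).sum() + x.prod()

# Evaluate at a fixed point so the results are reproducible.
x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)

# Gradient: build the computation graph, then differentiate the scalar output.
y = f(x)
(grad,) = torch.autograd.grad(y, x, create_graph=True)

# Hessian via the functional API (available in PyTorch >= 1.5).
hess = torch.autograd.functional.hessian(f, x)
```

For this `f`, the gradient at `(1, 2, 3)` works out by hand to `(8, 7, 8)`, which matches what autograd returns, and the Hessian comes back as a symmetric 3×3 matrix. But does this scale to a function wrapping a full network?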

Longti
  • In general yes, but I have never tried to do that so I can't provide much detail. Perhaps [this](http://www.philipzucker.com/pytorch-trajectory-optimization-3-plugging-hessian/) will be helpful? – Jatentaki Dec 25 '18 at 09:40
  • Does this answer your question? [PyTorch most efficient Jacobian/Hessian calculation](https://stackoverflow.com/questions/56480578/pytorch-most-efficient-jacobian-hessian-calculation) – iacob Apr 04 '21 at 11:06

0 Answers