
If I want to give a custom or known gradient for a function, how can I do that in the ad library? (I don't want to autodifferentiate through this function.) I am using the grad function in this library.

If the library doesn't provide this feature, is there some way I can easily implement this functionality myself, perhaps by changing the definitions of leaf nodes in ad or by editing the dual numbers that presumably carry the numerical gradients?

Here's a concrete example of what I mean:

Say I have some function I want to take the gradient of, say f(x, y) = x^2 + 3 * g(x, y)^2. Now suppose g(x, y) is a function whose definition is complicated, but whose gradient I've already calculated analytically and which is quite simple. Then, when I take grad f and evaluate it at a point (x, y), I'd like to just plug in my custom gradient for g instead of autodiffing through it: something like my_nice_grad_of_g (x, y).
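To make that concrete, here's a rough sketch of the two versions side by side. The definition of g below is just a stand-in (my real g is much messier), and my_nice_grad_of_g is written to match that stand-in. gradF_manual is the chain rule, grad f(x, y) = (2x, 0) + 6 g(x, y) grad g(x, y), expanded by hand using my known gradient; what I want is for grad f to produce the same numbers without me doing that expansion myself:

```haskell
import Numeric.AD (grad)

-- Stand-in for my complicated g (the real definition is much messier).
g :: Floating a => a -> a -> a
g x y = sin (x * y) + exp x

-- The gradient of g that I've already worked out analytically
-- (this one matches the stand-in definition above).
my_nice_grad_of_g :: Floating a => a -> a -> (a, a)
my_nice_grad_of_g x y = (y * cos (x * y) + exp x, x * cos (x * y))

-- f(x, y) = x^2 + 3 * g(x, y)^2
f :: Floating a => [a] -> a
f [x, y] = x ^ 2 + 3 * g x y ^ 2
f _      = error "f expects exactly two arguments"

-- What I do today: grad differentiates through g's definition.
gradF_auto :: [Double] -> [Double]
gradF_auto = grad f

-- What I'd like grad to do for me: use my_nice_grad_of_g via the
-- chain rule, here written out by hand.
gradF_manual :: Double -> Double -> [Double]
gradF_manual x y =
  let (gx, gy) = my_nice_grad_of_g x y
      gv       = g x y
  in [2 * x + 6 * gv * gx, 6 * gv * gy]

main :: IO ()
main = do
  print (gradF_auto [1, 2])    -- ad's answer, differentiating through g
  print (gradF_manual 1 2)     -- same numbers, without autodiffing g
```

Expanding the chain rule by hand is fine for this toy f, but it defeats the purpose for anything larger, which is why I'd like to attach the known gradient to g itself.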

I see that other autodiff libraries do provide this feature; for example, Stan and TensorFlow both allow users to define custom gradients for a function.

