I would like to use GradientTape to observe gradients during eager execution mode. Is it possible to create a GradientTape once, which then records everything, as if it had global context?

Here is an example of what I would like to do:

import numpy as np
import tensorflow as tf

x = tf.Variable(np.ones((2,)))
y = 2*x
z = 2*y
tf.gradients(z, x) # RuntimeError, not supported in eager execution

Now, this can be fixed easily:

with tf.GradientTape() as g:
    y = 2*x
    z = 2*y
    
g.gradient(y, x) # this works

But the problem is that I often don't have the definitions of y and z immediately after each other. For example, what if the code is executed in a Jupyter notebook and they are in different cells?
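To illustrate, here is a rough sketch of the multi-cell situation (reusing the x from above; the None result is my expectation of how the tape behaves, not something I've quoted from the docs): once the with block's cell has finished, later cells run outside the tape's context.

import numpy as np
import tensorflow as tf

x = tf.Variable(np.ones((2,)))

# cell 1: the tape only records while the with-block is active,
# and the block closes when this cell finishes executing
with tf.GradientTape() as g:
    y = 2*x

# cell 2: this multiplication runs outside the tape's context,
# so it is not recorded
z = 2*y

# the tape has no record of how z depends on x,
# so this presumably returns None instead of a gradient
g.gradient(z, x)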

Can I define a GradientTape that watches everything, globally?

1 Answer

I found this workaround:

import numpy as np
import tensorflow as tf

# persistent is not necessary for g to work globally
# it only means that gradients can be computed more than once,
# which is important for the interactive jupyter notebook use-case
g = tf.GradientTape(persistent=True)

# this is the workaround
g.__enter__()

# you can execute this anywhere, also split across separate cells
x = tf.Variable(np.ones((2,)))
y = 2*x
z = 2*y

g.gradient(z, x)
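When you are done recording, you can close the tape by hand as well. This is just the mirror image of the __enter__ call above (the standard context-manager protocol), plus dropping the reference so the persistent tape's recorded operations can be freed:

# close the tape manually; this mirrors what leaving a with-block would do
g.__exit__(None, None, None)

# a persistent tape keeps its recorded operations around,
# so drop the reference once the gradients are no longer needed
del g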