I'm trying to write a custom training loop. Here is sample code of what I'm trying to do. I have two trainable parameters, and one parameter updates the other. See the code below:
import tensorflow as tf

x1 = tf.Variable(1.0, dtype=tf.float32)
x2 = tf.Variable(1.0, dtype=tf.float32)

with tf.GradientTape() as tape:
    n = x2 + 4
    x1.assign(n)
    x = x1 + 1
    y = x**2

val = tape.gradient(y, [x1, x2])
for v in val:
    print(v)
and the output is
tf.Tensor(12.0, shape=(), dtype=float32)
None
It seems like GradientTape is not watching the second parameter (x2). Both parameters are of type tf.Variable, so GradientTape should watch both of them. I also tried tape.watch(x2), which also did not work. Am I missing something?
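For reference, here is roughly what the tape.watch attempt looked like (a sketch of what I tried, with the watch call placed inside the tape context; the gradient for x2 still comes back as None):

```python
import tensorflow as tf

x1 = tf.Variable(1.0, dtype=tf.float32)
x2 = tf.Variable(1.0, dtype=tf.float32)

with tf.GradientTape() as tape:
    tape.watch(x2)   # explicit watch, even though trainable Variables are watched by default
    n = x2 + 4
    x1.assign(n)     # x1 is overwritten with x2 + 4
    x = x1 + 1
    y = x**2

grads = tape.gradient(y, [x1, x2])
print(grads[0])  # tf.Tensor(12.0, shape=(), dtype=float32)
print(grads[1])  # None
```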