I am working with the following snippet of TensorFlow-related code:
def train_step(emb, y_onehot, step):
    with tf.GradientTape() as tape:
        logits = model(emb, training=True)
        assert model.trainable_variables
        logits.shape.assert_is_compatible_with(y_onehot.shape)
        loss_value = loss_obj(y_true=y_onehot, y_pred=logits)
    grads = tape.gradient(loss_value, model.trainable_variables)  # tape!
    opt.apply_gradients(zip(grads, model.trainable_variables))
The line marked "tape!" uses the tape variable outside of (after) the with resource-management block. I had actually thought this would be a syntax error, but it isn't. Does tape simply become a normal local variable at that point? Are there other implications of this usage?
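For reference, here is a minimal pure-Python sketch of the behavior I am asking about, using a hypothetical Tracker context manager (not from TensorFlow) to show that the name bound by "with ... as" outlives the block, even though __exit__ has already run by then:

```python
class Tracker:
    """Hypothetical context manager that records whether it was exited."""

    def __init__(self):
        self.closed = False

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        self.closed = True
        return False  # do not suppress exceptions

with Tracker() as t:
    pass

# The name t is still bound after the block; only __exit__ has run.
print(t.closed)  # True
```

If the analogy holds, tape would likewise remain an ordinary variable after the with block, with the context exit having changed its internal state (e.g. stopped recording) rather than unbinding the name.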