
I am working with the following snippet of TensorFlow-related code:

def train_step(emb, y_onehot, step):
    with tf.GradientTape() as tape:
        logits = model(emb, training=True)
        assert model.trainable_variables
        logits.shape.assert_is_compatible_with(y_onehot.shape)
        loss_value = loss_obj(y_true=y_onehot, y_pred=logits)
    grads = tape.gradient(loss_value, model.trainable_variables)  # tape!
    opt.apply_gradients(zip(grads, model.trainable_variables))

The line marked `tape!` uses `tape` outside/after the `with` resource-management scope. I had actually thought this would be a syntax error. So does `tape` become a normal variable at that point? Are there other implications of this usage?
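As background, here is a minimal plain-Python sketch (using a throwaway temporary file; nothing TensorFlow-specific) of how a name bound by `with ... as` behaves after the block:

import os
import tempfile

with tempfile.NamedTemporaryFile(mode="w", delete=False) as f:
    f.write("hello")

# `f` is still in scope after the block: `with ... as f` binds an ordinary
# local variable; the block only controls when __enter__/__exit__ run.
print(f.name)    # fine: the object still exists and can be inspected
print(f.closed)  # True: __exit__ closed the underlying file

try:
    f.write("again")   # the object was shut down by __exit__
except ValueError as err:
    print(err)         # "I/O operation on closed file"

os.remove(f.name)      # clean up the temporary file

The same scoping applies to `tape`: exiting the block does not delete the name, and what happens to the object it refers to depends entirely on its `__exit__` method.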

  • I expect that you'll get a runtime error. The variable is still in scope, but the object associated with it will have been closed (or whatever context the `with` is handling). – Carcigenicate Jul 19 '20 at 17:37
  • It will very much depend on what `tf.GradientTape()` does here. A file pointer can still exist outside the with-block, but you can't actually access the file's contents anymore. Here, I don't know, but given that `tape` is only used outside the with-block, this seems like rather awkward code in the first place (see the TensorFlow sketch after these comments). – 9769953 Jul 19 '20 at 17:37
  • https://stackoverflow.com/a/45100308/4588779 – Ankush Jul 19 '20 at 17:37
  • @Ankush Nice find; that's probably the duplicate of this question. – 9769953 Jul 19 '20 at 17:39
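Following up on the comments, here is a small self-contained sketch (a toy variable rather than the model/optimizer from the question) of what `tf.GradientTape` specifically does here: exiting the `with` block only stops recording, calling `tape.gradient()` after the block is the intended usage, and a non-persistent tape releases its resources after that first call.

import tensorflow as tf

x = tf.Variable(3.0)
with tf.GradientTape() as tape:
    y = x * x              # operations inside the block are recorded
# Exiting the block stops recording, but `tape` is still a normal variable.
dy_dx = tape.gradient(y, x)    # intended usage: called after the block
print(dy_dx.numpy())           # 6.0

try:
    tape.gradient(y, x)        # the non-persistent tape was released above
except RuntimeError as err:
    print(err)

So the snippet in the question follows the standard TF2 training-step pattern; the main implication is that `tape.gradient()` can be called at most once unless the tape is created with `persistent=True`.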

0 Answers