
I have a model that, based on certain conditions, has some unconnected gradients, and this is exactly what I want. But TensorFlow prints a warning every time it encounters an unconnected gradient:

WARNING:tensorflow:Gradients do not exist for variables

Is there any way to only suppress this specific warning? I don't want to blindly suppress all warnings since there might be unexpected (and potentially useful) warnings in the future as I'm still working on my model.


1 Answer


Kinda hacky way:

gradients = tape.gradient(loss, model.trainable_variables)
# Apply only the gradients that exist, skipping unconnected (None) ones.
optimizer.apply_gradients([
    (grad, var)
    for (grad, var) in zip(gradients, model.trainable_variables)
    if grad is not None
])
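The filtering idiom itself doesn't depend on TensorFlow. As a quick sanity check, here is a minimal plain-Python sketch (the gradient values and variable names are made up; `None` stands in for an unconnected gradient, which is what `tape.gradient` returns for such variables by default):

```python
# Stand-ins for tape.gradient output: None marks an unconnected gradient.
gradients = [0.5, None, -1.2]
variables = ["w1", "w2", "w3"]  # hypothetical variable names

# Same filtering as above: drop (grad, var) pairs where grad is None.
pairs = [
    (grad, var)
    for (grad, var) in zip(gradients, variables)
    if grad is not None
]
print(pairs)  # [(0.5, 'w1'), (-1.2, 'w3')]
```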
Jared Nielsen
  • `apply_gradients` accepts iterators too (like `zip`), so your code also works without the `[ ]` that turns the generator expression into a list comprehension. – EliadL Jul 07 '20 at 05:11
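Another route to the question's actual ask (silencing just that one message) is a log filter: TensorFlow emits this warning through the standard `tensorflow` Python logger (also reachable via `tf.get_logger()`), so the mechanism can be sketched with the stdlib `logging` module alone. `SuppressGradientWarning` is a made-up name, and the exact warning text may vary across TensorFlow versions:

```python
import logging

class SuppressGradientWarning(logging.Filter):  # hypothetical name
    """Drop only log records about unconnected gradients."""
    def filter(self, record):
        return "Gradients do not exist" not in record.getMessage()

# tf.get_logger() returns logging.getLogger("tensorflow") in TF 2.x.
logger = logging.getLogger("tensorflow")
logger.addFilter(SuppressGradientWarning())

# The targeted message is dropped; other warnings still pass through.
logger.warning("Gradients do not exist for variables ...")  # suppressed
logger.warning("some other warning")                        # still emitted
```

Because the filter is attached to the logger itself (not a handler), it gates every handler, and unrelated warnings are unaffected.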