After getting grad_and_var, a list of (gradient, variable) tuples, from compute_gradients:
opt = tf.train.RMSPropOptimizer(learning_rate)
grad_and_var = opt.compute_gradients(losses, params)
I'd like to clip the gradients in grad_and_var. But when I do:
clipped_gradients, _ = tf.clip_by_global_norm(grad_and_var, max_gradient_norm)
directly, the resulting clipped_gradients is a flat list of tensors; that is, the gradients and the variables have been concatenated together.
If I do
clipped_gradients = [tf.clip_by_global_norm(x[0], max_gradient_norm)[0] for x in grad_and_var]
I get this error:
TypeError: t_list should be a sequence
Do you have any idea how I can fix this? Thanks a lot!
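For context, here is the pattern I think is needed, sketched in plain Python with scalar gradients as stand-ins for tensors: unzip the (gradient, variable) pairs, clip only the gradients by their global norm (mirroring what I understand tf.clip_by_global_norm to do), then re-pair them. The values and variable names here are hypothetical, just to illustrate the structure:

```python
import math

# Hypothetical stand-in for opt.compute_gradients output: (gradient, variable)
# pairs, with scalars in place of tensors for simplicity.
grad_and_var = [(3.0, "w"), (4.0, "b")]

# Separate the gradients from the variables instead of passing the raw
# (grad, var) tuples to the clipping function.
grads, variables = zip(*grad_and_var)

# Global-norm clipping: if the norm of all gradients together exceeds
# max_gradient_norm, scale every gradient by max_gradient_norm / global_norm.
max_gradient_norm = 1.0
global_norm = math.sqrt(sum(g * g for g in grads))
scale = min(1.0, max_gradient_norm / global_norm)
clipped_gradients = [g * scale for g in grads]

# Re-pair the clipped gradients with their variables, ready for
# something like opt.apply_gradients.
clipped_grad_and_var = list(zip(clipped_gradients, variables))
```

If this is the right idea, in TensorFlow it would amount to clipping the list of gradient tensors alone and then zipping the result back with the variables before applying them.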