
Is it possible to monitor the percentage of nonzero weights of the full network (not just a single layer) during training?

For example, I use

optim = tf.train.AdagradDAOptimizer(learning_rate=0.01).minimize(my_loss)

and

sess = tf.Session()
sess.run(tf.global_variables_initializer())
for i in range(10):
  loss, _ = sess.run([my_loss, optim])

and I would like to print the ratio of the number of nonzero weights to the total number of weights after every iteration. Is this possible?


1 Answer


The following code calculates the number of nonzero weights across all trainable variables.

import tensorflow as tf
import numpy as np

# Fetch the current values of all trainable variables as NumPy arrays.
tvars = sess.run(tf.trainable_variables())
# Count the nonzero entries per variable and sum over the whole network.
nonzero_parameters = np.sum([np.count_nonzero(var) for var in tvars])
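Since tvars holds plain NumPy arrays after the sess.run call, the total weight count, and with it the ratio you asked for, follows directly:

total_parameters = np.sum([var.size for var in tvars])   # total number of weights
ratio = nonzero_parameters / float(total_parameters)     # fraction of nonzero weights
print('nonzero / total = %.4f' % ratio)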

This answer shows another way to compute the total number of weights: How to count total number of trainable parameters in a tensorflow model?
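Putting the pieces together, here is a minimal sketch of the full monitoring loop, assuming my_loss is defined as in your question. Note that AdagradDAOptimizer requires a global_step argument in its constructor, which your snippet omits:

import tensorflow as tf
import numpy as np

# AdagradDA needs the global step both in its constructor (for its
# update rule) and in minimize() (so the step actually increments).
global_step = tf.train.get_or_create_global_step()
optim = tf.train.AdagradDAOptimizer(
    learning_rate=0.01,
    global_step=global_step).minimize(my_loss, global_step=global_step)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for i in range(10):
        loss, _ = sess.run([my_loss, optim])
        # Pull the current weight values and compute the nonzero ratio.
        tvars = sess.run(tf.trainable_variables())
        nonzero = np.sum([np.count_nonzero(v) for v in tvars])
        total = np.sum([v.size for v in tvars])
        print('step %d, loss %.4f, nonzero ratio %.4f'
              % (i, loss, nonzero / float(total)))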
