```python
# initializer for the global variables (here only w and b)
init = tf.global_variables_initializer()

with tf.Session() as session:
    session.run(init)
    # ...
```

I want a way (an operation, or any attribute of `tf.global_variables_initializer()`) to print information about `w` and `b` here. This would help me keep track of how many variables I have defined in my graph as it grows bigger. Any suggestion would be helpful.

Anu

1 Answer


As documented:

> `tf.global_variables_initializer()` is just a shortcut for `variables_initializer(global_variables())`

Therefore, calling `tf.global_variables()` will give you the list of global variables in the default graph (the same ones the initializer covers). You can evaluate these variables or inspect them however you like.

Y. Luo
  • It worked, but I am curious to know why, when I execute the [program](https://stackoverflow.com/questions/49778603/tf-print-doesnt-print-the-shape-of-the-tensors) in a Jupyter notebook, the number of variables grows in increments of 2 (weights, bias). According to my understanding it should not increase, since the default session closes after every execution: `print('Total number of global variables {0}'.format(len(tf.global_variables())))` followed by `for i in tf.global_variables(): print(i.eval().shape)`. Any suggestion? – Anu Apr 15 '18 at 02:55
  • @Anubhav I'm not sure, but I think it is related to how you actually "execute" the program in Jupyter. If you rerun without restarting the kernel, it is like using the Python interpreter in interactive mode: running the code multiple times keeps adding tensors to the default graph rather than replacing them. Therefore, it makes sense that "the number of variables are getting created in the increment of 2", which is the number of variables in one run of your code. – Y. Luo Apr 15 '18 at 04:35