
By names I'm referring to:

tf.placeholder(tf.float32, name='NAME')
tf.get_variable("W", [n_in, n_out], initializer=w_init())

I have several placeholders that I want to access from other functions without passing a reference around. Assuming that placeholders with the given names exist, how can I get a reference to them? (This is all during graph construction, not at runtime.)

My second question: how can I get all variables that have a given name, regardless of their scope?

Example: all my weights are named "W" under many different scopes, and I want to get them all into a list without adding each one manually. I'd like to do the same with the biases, say, to plot a histogram.

1 Answer


First of all, you can get the placeholder using tf.Graph.get_tensor_by_name(). For example, assuming that you are working with the default graph:

placeholder1 = tf.placeholder(tf.float32, name='NAME')
placeholder2 = tf.get_default_graph().get_tensor_by_name('NAME:0')
assert placeholder1 == placeholder2
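
For instance, a function that never receives the placeholder as an argument can recover it by name. A minimal sketch, assuming the default graph and the placeholder name from the question:

import tensorflow as tf

def build_inputs():
    # Created once during graph construction.
    tf.placeholder(tf.float32, shape=[None, 3], name='NAME')

def use_placeholder():
    # No reference passed in; look the tensor up by name instead.
    x = tf.get_default_graph().get_tensor_by_name('NAME:0')
    return tf.reduce_sum(x)

build_inputs()
total = use_placeholder()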

Secondly, I would use the following function to get all variables with a given name (no matter their scope):

def get_all_variables_with_name(var_name):
    # Variable tensor names look like 'scope1/scope2/W:0',
    # so match on the suffix.
    name = var_name + ':0'
    # tf.all_variables() was later renamed to tf.global_variables();
    # use whichever your TensorFlow 1.x version provides.
    return [var for var in tf.global_variables() if var.name.endswith(name)]
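
As a usage sketch (the scope names and shapes here are made up for illustration), collecting every "W" and writing a histogram summary for each might look like this:

with tf.variable_scope('layer1'):
    tf.get_variable('W', [4, 8], initializer=tf.zeros_initializer())
with tf.variable_scope('layer2'):
    tf.get_variable('W', [8, 2], initializer=tf.zeros_initializer())

weights = get_all_variables_with_name('W')  # [layer1/W:0, layer2/W:0]

for w in weights:
    # Use w.op.name ('layer1/W'): summary names may not contain ':'.
    tf.summary.histogram(w.op.name, w)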
– rvinas
  • Flawless, thanks. One thing I'd like a note on: why the `:0`? And when, if ever, do you use `:1`? – Aug 13 '16 at 19:12
  • `op_name:0` means "the tensor that is the 0th output of an operation called `op_name`." So you might use `…:1` to get the output of an operation with multiple outputs, but both `tf.placeholder()` and `tf.Variable` are single-output ops so you'll always use `…:0` for them. – mrry Aug 13 '16 at 19:35
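
To make the `…:1` case concrete, here is a small sketch using tf.nn.top_k (a multi-output op, chosen purely for illustration):

values, indices = tf.nn.top_k(tf.constant([1., 3., 2.]), k=2, name='topk')
assert values.name == 'topk:0'   # first output of the op 'topk'
assert indices.name == 'topk:1'  # second output of the same op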