
I'm having trouble recovering a tensor by name, I don't even know if it's possible.

I have a function that creates my graph:

def create_structure(tf, x, input_size, dropout):
    with tf.variable_scope("scale_1") as scope:
        W_S1_conv1 = deep_dive.weight_variable_scaling([7, 7, 3, 64], name='W_S1_conv1')
        b_S1_conv1 = deep_dive.bias_variable([64])
        S1_conv1 = tf.nn.relu(deep_dive.conv2d(x_image, W_S1_conv1, strides=[1, 2, 2, 1], padding='SAME') + b_S1_conv1, name="Scale1_first_relu")
        .
        .
        .
    return S3_conv1, regularizer

I want to access the variable S1_conv1 outside this function. I tried:

with tf.variable_scope('scale_1') as scope_conv:
    tf.get_variable_scope().reuse_variables()
    ft = tf.get_variable('Scale1_first_relu')

But that is giving me an error:

ValueError: Under-sharing: Variable scale_1/Scale1_first_relu does not exist, disallowed. Did you mean to set reuse=None in VarScope?

But this works:

with tf.variable_scope('scale_1') as scope_conv:
    tf.get_variable_scope().reuse_variables()
    ft = tf.get_variable('W_S1_conv1')

I can get around this with

return S3_conv1, regularizer, S1_conv1

but I don't want to do that.

I think my problem is that S1_conv1 is not really a variable; it's just a tensor. Is there a way to do what I want?

protas

4 Answers


There is a function tf.Graph.get_tensor_by_name(). For instance:

import tensorflow as tf

c = tf.constant([[1.0, 2.0], [3.0, 4.0]])
d = tf.constant([[1.0, 1.0], [0.0, 1.0]])
e = tf.matmul(c, d, name='example')

with tf.Session() as sess:
    test = sess.run(e)
    print(e.name)  # example:0
    test = tf.get_default_graph().get_tensor_by_name("example:0")
    print(test)  # Tensor("example:0", shape=(2, 2), dtype=float32)
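
Applied to the question above, the lookup would look roughly like the sketch below. The full name 'scale_1/Scale1_first_relu:0' is an assumption based on the scope in the question; print the graph's node names to confirm what the relu actually ended up being called.

# Sketch only: assumes the graph was already built by create_structure() and
# that the relu tensor's full name is 'scale_1/Scale1_first_relu:0'.
graph = tf.get_default_graph()
ft = graph.get_tensor_by_name('scale_1/Scale1_first_relu:0')
# ft can now be passed to sess.run() like any other tensor in the graph.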
apfalz
  • For reference, if you need to get an OP instead of a tensor: http://stackoverflow.com/questions/42685994/how-to-get-a-tensorflow-op-by-name – David Parks Mar 23 '17 at 20:08
  • @apfalz How to arrive at "example:0" for the tensor name? – Bosen Jul 04 '17 at 09:30
  • Not sure if I get your question but, in the above example, if you print `e.name` you would know the name is `example:0`. TensorFlow adds the `:0` to the name you specify. – apfalz Jul 06 '17 at 14:20
  • Hi, thanks. Could you please tell me whether I can use `get_tensor_by_name` to get something defined by `tf.layers.dense`, e.g., `means` in this code sample [here](https://gist.github.com/Erichliu00/1ce345e548b31cf1f2a6efed34ba9dec)? – ytutow Aug 26 '17 at 02:44
  • I don't know how defining `tf.variable_scope`s inside of Python functions complicates matters, but you could try using `tf.get_collection(key, scope=None)`. Alternatively you could try something like `mean_vars = [i for i in tf.global_variables() if i.name.startswith('mean')]` or potentially `mean_vars = [i for i in tf.global_variables() if 'mean' in i.name]`. I haven't tested the second one, so no promises. That would give you the full name as TensorFlow understands it. From there `get_tensor_by_name()` should work fine (a short sketch of this approach follows this thread). – apfalz Aug 31 '17 at 16:22
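
A minimal sketch of the name-filtering approach from the comment above; the layer name 'mean' and the placeholder shape are only examples, not anything from the original question:

import tensorflow as tf

x = tf.placeholder(tf.float32, shape=[None, 4])
means = tf.layers.dense(x, units=2, name='mean')  # example layer name

# Variables whose names start with the layer name.
mean_vars = [v for v in tf.global_variables() if v.name.startswith('mean')]
print([v.name for v in mean_vars])  # e.g. ['mean/kernel:0', 'mean/bias:0']

# The layer's output tensor can be fetched by its full name as well.
print(means.name)
out = tf.get_default_graph().get_tensor_by_name(means.name)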

All tensors have string names, which you can list as follows:

[node.name for node in tf.get_default_graph().as_graph_def().node]

Once you know the name, you can fetch the Tensor using <name>:0 (the 0 refers to the output endpoint, which is somewhat redundant here).

For instance, if you do this:

tf.constant(1) + tf.constant(2)

you have the following Tensor names:

[u'Const', u'Const_1', u'add']

So you can fetch the output of the addition as:

sess.run('add:0')

Note that this is not part of the public API. Automatically generated tensor names are an implementation detail and may change.
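
Putting those pieces together, a minimal end-to-end sketch (TF 1.x graph mode; the auto-generated names shown are only an example, so print them rather than relying on them):

import tensorflow as tf

tf.constant(1) + tf.constant(2)

# List every node name in the default graph.
print([node.name for node in tf.get_default_graph().as_graph_def().node])
# e.g. ['Const', 'Const_1', 'add']

with tf.Session() as sess:
    # Fetch the output of the addition by its string name.
    print(sess.run('add:0'))  # 3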

Yaroslav Bulatov
  • Thanks for the help, but that's not really my problem. My tensor has been explicitly named "Scale1_first_relu", but I can't get a reference to it outside the function where it has been declared. I can, however, get a reference to the variable "W_S1_conv1". Are tensors local? Do they exist outside the function where they are created? – protas Apr 14 '16 at 14:22
  • Tensors exist in a Graph. If you use the default Graph, it's shared between all functions in the same thread. The `Scale1_first_relu` is more of a "suggestion" than an actual name. Its true name in the graph may have a prefix if it was created inside a scope, or a suffix automatically added for deduping. Just print the names of the tensors in the graph using the recipe above and search for tensors containing the `Scale1_first_relu` string. – Yaroslav Bulatov Apr 14 '16 at 16:56
  • I've tried that already. The complete name is "scale_1/Scale1_first_relu:0". Searching for that still gives me an error, but searching for 'scale_1/W_S1_conv1' works. It's like the tensor no longer exists. – protas Apr 14 '16 at 21:36
  • If the name is `scale_1/Scale1_first_relu:0` then you can get its value with `session.run('scale_1/Scale1_first_relu:0')`. – Yaroslav Bulatov Apr 14 '16 at 23:36
  • That's not exactly what I want and I'm not even sure this will work, but thanks for the effort. We have found a way around it; it turns out returning S1_conv1 from the function is not much of a problem. I apologize if the question wasn't clear enough. – protas Apr 15 '16 at 18:46
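
A sketch of the search described in this thread: list every tensor in the default graph and keep the ones whose name contains the string passed to the op (here 'Scale1_first_relu', from the question):

graph = tf.get_default_graph()

# Every operation's output tensors, filtered by a substring match on the name.
matches = [t for op in graph.get_operations() for t in op.outputs
           if 'Scale1_first_relu' in t.name]
print([t.name for t in matches])  # e.g. ['scale_1/Scale1_first_relu:0']
# Any of these names can then be fetched with sess.run(name) or get_tensor_by_name(name).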

All you gotta do in this case is:

ft = tf.get_default_graph().get_tensor_by_name('scale_1/Scale1_first_relu:0')

Or, simpler still, infer the tensor name from the corresponding .pbtxt file that comes with the model's .pb file. Since it depends on the model, every case is different.
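
A sketch of that approach, assuming a TF 1.x GraphDef exported in text format (the file name model.pbtxt is a placeholder):

import tensorflow as tf
from google.protobuf import text_format

# Parse the text-format GraphDef and list the node names it contains.
with open('model.pbtxt') as f:  # placeholder path
    graph_def = text_format.Parse(f.read(), tf.GraphDef())

print([node.name for node in graph_def.node])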

Kris Stern