
Is it possible to reuse tensors in multiple tf-graphs, even after they are reset?


Problem:

I have a large dataset that I want to evaluate with many different tf-graphs. For each evaluation, TensorFlow is reset with `tf.compat.v1.reset_default_graph()` and initialized completely from scratch.

Imho, it seems redundant and slow to call the data-to-tensor procedure every time, so I thought I could just define the data tensor once and use it for all future evaluations.

Unfortunately, reusing tensors does not seem to be possible; I get a 'Tensor must be from the same graph as Tensor' error:

ValueError: Tensor("Const:0", shape=(1670,), dtype=float32, device=/device:GPU:0) must be from the same graph as Tensor("Const_1:0", shape=(1670,), dtype=float32).
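For context, a minimal sketch that reproduces this error (shapes and values are illustrative, not taken from the original code; it assumes TF 1.x-style graph mode, e.g. via `tf.compat.v1.disable_eager_execution()` under TF 2.x):

```python
import tensorflow as tf

tf.compat.v1.disable_eager_execution()  # TF 1.x-style graph mode

# The data tensor is created in the first default graph.
data = tf.constant([0.0] * 1670, dtype=tf.float32)

# After the reset, `data` still points at the old, now-orphaned graph.
tf.compat.v1.reset_default_graph()

# Any op in the new default graph that touches `data` raises
# "Tensor ... must be from the same graph as Tensor ...".
other = tf.constant([1.0] * 1670, dtype=tf.float32)
total = tf.add(data, other)  # ValueError
```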

Is it possible to reuse these tensors somehow?

Who Knows

1 Answer


Check out this answer to another question: https://stackoverflow.com/a/42616834/13514201

TensorFlow stores all operations on an operational graph. This graph defines which operations feed into which, linking everything together so that TensorFlow can follow the steps you have set up to produce your final output. If you try to feed a Tensor or operation from one graph into a Tensor or operation on another graph, it will fail. Everything must be on the same execution graph.

Try removing `with tf.Graph().as_default():`
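In practice this means the data tensor has to be re-created inside every new graph. A minimal sketch of one way to keep the expensive part out of the loop, assuming the data can be held once as a NumPy array and only the cheap `tf.constant` call is repeated per graph (function and variable names here are illustrative):

```python
import numpy as np
import tensorflow as tf

tf.compat.v1.disable_eager_execution()  # TF 1.x-style graph mode

# Load / preprocess the data once, outside of any graph.
data_np = np.zeros(1670, dtype=np.float32)  # stand-in for the real dataset

def evaluate(build_model):
    """Run one evaluation in a fresh graph, rebuilding the data tensor each time."""
    tf.compat.v1.reset_default_graph()
    data = tf.constant(data_np)        # belongs to the new default graph
    output = build_model(data)         # model-specific ops live in the same graph
    with tf.compat.v1.Session() as sess:
        return sess.run(output)

# Each call gets its own graph; only the NumPy array is shared between them.
print(evaluate(lambda d: tf.reduce_sum(d)))
print(evaluate(lambda d: tf.reduce_mean(d)))
```

If graph size is a concern, a `tf.compat.v1.placeholder` fed through `feed_dict` avoids baking the whole dataset into each graph as a constant.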

StackPancakes
  • Thank you! I am not using `as_default` currently though. Also, the tensors are only constants, no operators at all. Do you think the GPU option is a problem? I think maybe the config could be wrong, I'll check that first – Who Knows May 11 '20 at 19:24
  • Hmm, it's hard to know. Another way to debug this is to print the identity of each graph, like this: `print(variable_name.graph)`. What version of TensorFlow are you running? – StackPancakes May 11 '20 at 19:41
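
For reference, a minimal sketch of the graph-identity check suggested in the last comment (again assuming graph mode):

```python
import tensorflow as tf

tf.compat.v1.disable_eager_execution()

a = tf.constant([1.0])
tf.compat.v1.reset_default_graph()
b = tf.constant([2.0])

# Every graph-mode tensor records the tf.Graph it was created in.
print(a.graph)
print(b.graph)
print(a.graph is b.graph)  # False: `a` still points at the pre-reset graph
```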