
I have a huge global variable used by both training and evaluation, but with different shapes. Now that I try to run both evaluation and training in the same process, I stumbled on the fact that I can't really delete variables defined in a TensorFlow graph. The workaround suggested, for instance here, is to use reset_default_graph(), but this does not seem to play well with the graph context manager.

import numpy as np
import tensorflow as tf

GRAPH = tf.Graph()

def train(examples):
    with GRAPH.as_default() as g:
        # actually this is a huge variable
        global_var = tf.get_variable('global_var',
                                     initializer=np.full((examples, 32), 0.0),
                                     trainable=False)

def evaluate(examples):
    # tf.reset_default_graph() # ValueError: Variable global_var already exists
    with GRAPH.as_default() as g: # want to re-initialize global_var to another size
        tf.reset_default_graph() # resetting the default graph from inside its context manager
        global_var = tf.get_variable('global_var',
                                     initializer=np.full((examples, 32), 0.0),
                                     trainable=False)
        # in fact tensorflow creates a new graph here and does not use GRAPH to define global_var

train(32)
evaluate(8)

Results in:

Traceback (most recent call last):
  File "C:/Users/MrD/.PyCharm2017.1/config/scratches/scratch_44.py", line 22, in <module>
    evaluate(8)
  File "C:/Users/MrD/.PyCharm2017.1/config/scratches/scratch_44.py", line 19, in evaluate
    trainable=False)
  File "C:\_\Python35\lib\contextlib.py", line 66, in __exit__
    next(self.gen)
  File "C:\_\Python35\lib\site-packages\tensorflow\python\framework\ops.py", line 3616, in get_controller
    if self.stack[-1] is not default:
IndexError: list index out of range

So what is the correct way to use reset_default_graph()? Is there really no way to redefine a Variable, discarding the old, potentially huge, initializer?

Mr_and_Mrs_D
  • [This](https://stackoverflow.com/questions/39352865/resetting-default-graph-does-not-remove-variables) answers why it doesn't work, not sure if the proposed solution also fits your problem. – P-Gn Jul 01 '17 at 17:32
  • Thanks @user1735003 - any ideas how I could then dispose of this variable (can't use placeholders for performance reasons) ? – Mr_and_Mrs_D Jul 01 '17 at 18:29
  • Perhaps `tf.assign` could work for you? (with `validate_shape=False`) – P-Gn Jul 01 '17 at 18:59
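The `tf.assign` with `validate_shape=False` idea from the last comment can be sketched as follows (a minimal example, not tested against the asker's actual model; it assumes the TF1-style API, available as `tensorflow.compat.v1` on TF2, and passes `use_resource=False` because only ref variables allow shape-changing assigns):

```python
import numpy as np
import tensorflow.compat.v1 as tf  # TF1-style API; on 1.x just `import tensorflow as tf`

graph = tf.Graph()
with graph.as_default():
    # Create the variable with an unvalidated (unknown) static shape so it can
    # be resized later; use_resource=False keeps it a ref variable.
    global_var = tf.Variable(np.full((32, 32), 0.0), trainable=False,
                             validate_shape=False, use_resource=False)
    # An assign op that skips the shape check, so a differently-shaped value
    # can replace the old one without rebuilding the graph.
    resize = tf.assign(global_var, np.full((8, 32), 0.0), validate_shape=False)

with tf.Session(graph=graph) as sess:
    sess.run(global_var.initializer)
    print(sess.run(global_var).shape)  # (32, 32)
    sess.run(resize)
    print(sess.run(global_var).shape)  # (8, 32)
```

This keeps a single variable alive and swaps its contents, which avoids re-running the huge initializer but also means the old buffer is only freed when the new assign runs.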

1 Answer


Turns out that it does not make sense to "reset the default graph" inside a graph context manager - see: https://github.com/tensorflow/tensorflow/issues/11121. Newer versions should add a more helpful error message:

AssertionError: Do not use tf.reset_default_graph() to clear nested graphs. If you need a cleared graph, exit the nesting and create a new graph.

as discussed in the issue above and implemented here
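Concretely, the fix for the code in the question is to drop the module-level `GRAPH` and build a fresh `tf.Graph` inside each function, so there is nothing to reset and no name collision (a minimal sketch, assuming the TF1-style API, available as `tensorflow.compat.v1` on TF2):

```python
import numpy as np
import tensorflow.compat.v1 as tf  # TF1-style API; on 1.x just `import tensorflow as tf`

def build(examples):
    graph = tf.Graph()  # fresh graph per call: no reset, no "already exists" error
    with graph.as_default():
        global_var = tf.get_variable('global_var',
                                     initializer=np.full((examples, 32), 0.0),
                                     trainable=False)
    return graph, global_var

train_graph, train_var = build(32)  # variable of shape (32, 32)
eval_graph, eval_var = build(8)     # variable of shape (8, 32)
```

The old graph, including its huge initializer constant, becomes garbage-collectable as soon as no references to it remain.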

Mr_and_Mrs_D