I have a huge global variable used by both training and evaluation, but with a different shape in each. Now that I try to run both evaluation and training in the same process, I stumbled on the fact that I can't really delete the variables defined in a TensorFlow graph. The workaround suggested, for instance here, is to use reset_default_graph(), but this does not seem to play well with the graph context manager.
import numpy as np
import tensorflow as tf

GRAPH = tf.Graph()

def train(examples):
    with GRAPH.as_default() as g:
        # actually this is a huge variable
        global_var = tf.get_variable('global_var',
                                     initializer=np.full((examples, 32), 0.0),
                                     trainable=False)

def evaluate(examples):
    # tf.reset_default_graph()  # ValueError: Variable global_var already exists
    with GRAPH.as_default() as g:  # needs to be initialized to some other size
        tf.reset_default_graph()
        global_var = tf.get_variable('global_var',
                                     initializer=np.full((examples, 32), 0.0),
                                     trainable=False)
        # in fact tensorflow creates a new graph and does not use GRAPH to define global_var

train(32)
evaluate(8)
Results in:
Traceback (most recent call last):
File "C:/Users/MrD/.PyCharm2017.1/config/scratches/scratch_44.py", line 22, in <module>
evaluate(8)
File "C:/Users/MrD/.PyCharm2017.1/config/scratches/scratch_44.py", line 19, in evaluate
trainable=False)
File "C:\_\Python35\lib\contextlib.py", line 66, in __exit__
next(self.gen)
File "C:\_\Python35\lib\site-packages\tensorflow\python\framework\ops.py", line 3616, in get_controller
if self.stack[-1] is not default:
IndexError: list index out of range
So what is the correct way to use reset_default_graph()? Is there really no way to redefine a Variable, discarding the old (potentially huge) initializer?
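For reference, here is a minimal sketch of one alternative I can think of: constructing a fresh tf.Graph inside each call instead of sharing a module-level one, so the previous graph (and its huge initializer) can be garbage-collected once nothing references it. The build helper and the tf.compat.v1 shim are my own additions, not part of the code above:

```python
import numpy as np
import tensorflow as tf

# Shim so the sketch runs under both TF 1.x and 2.x (assumption on my part)
tf1 = tf.compat.v1 if hasattr(tf, 'compat') else tf

def build(examples):
    # A brand-new Graph per call; no name collision with earlier graphs,
    # and the old graph becomes garbage-collectable when dereferenced.
    graph = tf1.Graph()
    with graph.as_default():
        global_var = tf1.get_variable(
            'global_var',
            initializer=np.full((examples, 32), 0.0),
            trainable=False)
    return graph, global_var

train_graph, train_var = build(32)  # variable of shape (32, 32)
eval_graph, eval_var = build(8)     # variable of shape (8, 32)
```

But this means threading the graph object through everything that uses it, which is what the module-level GRAPH was meant to avoid.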