I have a hypothetical graph that performs a series of computations as follows:
a_0 = tf.placeholder(tf.float32)  # input
a_1 = some_op_1(a_0)              # some_op_1..3 stand for arbitrary ops
a_2 = some_op_2(a_1)
a_3 = some_op_3(a_2)
Observe that when computing a_3, the tensors a_0 and a_1 are no longer needed, so they could be discarded before memory is allocated for a_3. Is there some way to ask TensorFlow to perform this memory optimization (I accept that it may come at some cost in time)?
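For reference, the closest knob I've found so far is Grappler's memory optimizer, which can be enabled through the session config. Below is a minimal sketch of what I mean, using tf.square as a stand-in for the some_op_* calls; the RewriterConfig enum values come from rewriter_config.proto and their availability may vary across TF 1.x versions:

import numpy as np
import tensorflow as tf
from tensorflow.core.protobuf import rewriter_config_pb2

# Stand-ins for some_op_1..3 so the snippet runs end to end.
a_0 = tf.placeholder(tf.float32, shape=[1024, 1024])
a_1 = tf.square(a_0)
a_2 = tf.square(a_1)
a_3 = tf.square(a_2)

# Ask Grappler (TF's graph rewriter) to optimize memory use;
# SCHEDULING_HEURISTICS tries to move deallocations earlier.
rewrite_options = rewriter_config_pb2.RewriterConfig(
    memory_optimization=rewriter_config_pb2.RewriterConfig.SCHEDULING_HEURISTICS)
config = tf.ConfigProto(
    graph_options=tf.GraphOptions(rewrite_options=rewrite_options))

with tf.Session(config=config) as sess:
    out = sess.run(a_3, feed_dict={a_0: np.random.rand(1024, 1024).astype(np.float32)})

Whether this actually frees a_0 and a_1 before a_3 is allocated is exactly what I'm unsure about, hence the question.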
Please note that this is not the same as this question about allocating memory only when needed.
EDIT: This network will not be trained, so don't worry about backprop.