
Neither TensorFlow nor Theano seems to support cyclic computational graphs; cyclic elements are implemented as recurrent cells with a buffer and unrolling (RNN/LSTM cells), but this limitation is mostly related to the computation of back-propagation. I don't have a particular need to compute back-propagation, just the forward propagation.

Is there a way around this limitation, or perhaps a way to break arbitrary computational graphs down into acyclic components?

diffeomorphism

1 Answer


TensorFlow does support cyclic computation graphs. The tf.while_loop() function allows you to specify a while loop with arbitrary subgraphs for the loop condition and the loop body, and the runtime can execute independent loop iterations in parallel. The tf.scan() function is a higher-level API similar to Theano's theano.scan() function. Both allow you to loop over tensors of dynamic size.
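For concreteness, here is a minimal sketch of both functions, assuming the TF 1.x-era graph API (`tf.Session`, `tf.placeholder`); the variable names are illustrative:

```python
import tensorflow as tf

# Cumulative sum 0 + 1 + ... + 9 expressed as a cyclic (iterative) graph node,
# rather than an unrolled chain of ops.
i = tf.constant(0)
total = tf.constant(0)

def cond(i, total):
    return i < 10          # loop condition is an arbitrary subgraph

def body(i, total):
    return i + 1, total + i  # loop body is an arbitrary subgraph

final_i, final_total = tf.while_loop(cond, body, [i, total])

# tf.scan: a higher-level running accumulation over a tensor of dynamic size.
elems = tf.placeholder(tf.int32, shape=[None])
running = tf.scan(lambda acc, x: acc + x, elems, initializer=tf.constant(0))

with tf.Session() as sess:
    print(sess.run(final_total))                      # 45
    print(sess.run(running, {elems: [1, 2, 3, 4]}))   # [1 3 6 10]
```

Note that tf.while_loop() builds a single loop construct in the graph instead of unrolling the body, so the number of iterations can depend on values computed at runtime.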

mrry
  • Thanks. I've added an example (http://stackoverflow.com/q/37566925/1792701) of a graph where I try to use the previous value of a tensor as an input, but the tensor comes back as zero on each run rather than preserving its previous value. Should I just replace `sess.run` with `while_loop`? – diffeomorphism Jun 01 '16 at 11:11
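If the goal in the linked example is to carry a value across separate `sess.run()` calls (as opposed to within a single `tf.while_loop`), the usual approach is to store the state in a `tf.Variable` and update it in the graph; a minimal sketch, again assuming the TF 1.x API:

```python
import tensorflow as tf

# State that should survive across separate sess.run() calls must live in a
# tf.Variable; an ordinary tensor is recomputed from scratch on every run.
state = tf.Variable(0, name="state")
update = tf.assign(state, state + 1)  # reads the previous value as input

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(3):
        print(sess.run(update))  # prints 1, 2, 3 -- the value persists
```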