I've just started playing with TensorFlow and I'm trying to implement a very simple RNN. The RNN has x as input, y as output, and consists of just a single layer that takes x and its own previous output as input. Here's a picture of the sort of thing I have in mind:
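
Written out as plain NumPy, the recurrence I'm picturing is roughly this (W_x, W_y and b are just made-up names for the single layer's weights; nothing here is TensorFlow-specific):

```python
import numpy as np

def softmax(z):
    e = np.exp(z - np.max(z))
    return e / e.sum()

def step(x_t, y_prev, W_x, W_y, b):
    # One step of the layer: it sees the current input x_t and its own
    # previous output y_prev, and emits the new softmax output y_t.
    return softmax(W_x @ x_t + W_y @ y_prev + b)
```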
The problem is, I can't see any way through the TensorFlow API to construct a graph with a cycle in it. Whenever I define a Tensor I have to specify what its inputs are, which means I have to have already defined those inputs. So there's a chicken-and-egg problem.
I don't even know if it makes sense to want to define a graph with a cycle (what gets computed first? Would I have to define an initial value for the softmax node?). I played with the idea of using a variable to represent the previous output, manually taking the value of y and storing it back into the variable after feeding each training sample through (something like the sketch below). But that would be very slow unless there's a way to represent this procedure in the graph itself(?).
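
For concreteness, this is the kind of variable-based workaround I mean, as a rough sketch against the 1.x-style API (the sizes and weight names are made up, and I haven't tried to make it trainable):

```python
import numpy as np
import tensorflow as tf  # 1.x-style API; newer versions would need tf.compat.v1

input_size, num_classes = 4, 3                      # made-up sizes, just for illustration

x = tf.placeholder(tf.float32, [1, input_size])     # one sample at a time
y_prev = tf.Variable(tf.zeros([1, num_classes]),    # holds the previous output "by hand"
                     trainable=False)

W_x = tf.Variable(tf.random_normal([input_size, num_classes]))
W_y = tf.Variable(tf.random_normal([num_classes, num_classes]))
b = tf.Variable(tf.zeros([num_classes]))

y = tf.nn.softmax(tf.matmul(x, W_x) + tf.matmul(y_prev, W_y) + b)
store_y = tf.assign(y_prev, y)                      # copy the new output back into the variable

sequence = np.random.rand(10, 1, input_size).astype(np.float32)  # dummy data

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for x_t in sequence:
        # One sess.run per time step -- this is the slow round-tripping I'm worried about.
        y_val, _ = sess.run([y, store_y], feed_dict={x: x_t})
```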
I know the TensorFlow tutorials show example implementations of RNNs, but they cheat by pulling an LSTM module out of the library that already has the cycle baked into it. Overall the tutorials are good for stepping you through how to build certain things, but they could do a better job of explaining how this beast really works.
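
By "pull an LSTM module out of the library" I mean something along these lines (sizes are made up, and the cell classes have moved between modules across versions):

```python
import tensorflow as tf  # 1.x API; the rnn_cell module has moved around between versions

batch_size, num_steps, num_features = 2, 5, 4        # made-up sizes
inputs = tf.placeholder(tf.float32, [batch_size, num_steps, num_features])

cell = tf.nn.rnn_cell.BasicLSTMCell(num_units=8)     # the pre-built cell the tutorials reach for
outputs, state = tf.nn.dynamic_rnn(cell, inputs, dtype=tf.float32)
```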
So, TensorFlow experts, is there a way to build this thing? How would I go about doing it?