
I'm building a TF graph; this is part of my code:

class Model:
    def __init__(self, neurons, links, input_neurons_num=4):
        """
        Constructor.
        :param neurons: list of neurons
        :param links: list of (from_neuron, to_neuron, weight) tuples
        :param input_neurons_num: number of input neurons
        """
        # neuron_id as key, list of weights entering it as value
        self.weights = {}
        # neuron_id as key, list of neurons entering it as value
        self.connections = {}
        self.graph = None

    def build_graph(self):
        with self.graph.as_default():
            operations = {}

            # create Variables for the input vertices
            for neuron_id in self.input_neurons:
                self.inputs[neuron_id] = tf.get_variable(
                    name=str(neuron_id), shape=(),
                    initializer=tf.zeros_initializer)

            # create input & output vertices
            for neuron_id in self.connections:
                input_neuron_ids = self.connections[neuron_id]

                # weights
                v_weights = tf.constant(self.weights[neuron_id])
                # input vertices
                v_inputs = []

                for input_neuron_id in input_neuron_ids:
                    if self.is_input_neuron(input_neuron_id):
                        vertex = self.inputs[input_neuron_id]
                    else:
                        # KeyError if input_neuron_id isn't created yet
                        vertex = operations[input_neuron_id]

                    v_inputs.append(vertex)

                # multiply weights and inputs
                mul = tf.multiply(v_inputs, v_weights, name=str(neuron_id))

So I have a list of links, where each link has a from_neuron, a to_neuron and a weight. For example: (1, 2, 3) => an edge (connection) from neuron 1 to neuron 2 with weight 3. I want to iterate through all links and build the graph based on the connections.

At the start I know the input and output nodes. The idea is to iterate through the links and gradually build the graph. If there is a node 4 with links (1,4,2) and (2,4,3.5), I would like to create a tf.operation that multiplies the output of node 1 by its weight (2) and the output of node 2 by its weight (3.5), sums the obtained values, and passes the result forward through the network. But the problem is: if I have input nodes 1, 2, 3, and node 4 has a connection from e.g. node 7 which hasn't been created yet, the code will try to reference a node that doesn't exist and I will get a KeyError.
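For the acyclic case, one way to avoid the KeyError is to compute a topological order of the neurons first and create the graph nodes in that order, so every input of a node is guaranteed to exist before the node itself. A minimal sketch using Kahn's algorithm (plain Python, no TF; the `links` list of (from, to, weight) tuples and the example data are made up to match the description above):

```python
from collections import defaultdict, deque

def creation_order(links):
    """Return neuron ids ordered so that every neuron appears after
    all neurons that feed into it (Kahn's algorithm).
    Raises ValueError if the links contain a cycle."""
    incoming = defaultdict(int)
    outgoing = defaultdict(list)
    nodes = set()
    for src, dst, _weight in links:
        nodes.update((src, dst))
        incoming[dst] += 1
        outgoing[src].append(dst)

    ready = deque(n for n in nodes if incoming[n] == 0)
    order = []
    while ready:
        node = ready.popleft()
        order.append(node)
        for succ in outgoing[node]:
            incoming[succ] -= 1
            if incoming[succ] == 0:
                ready.append(succ)

    if len(order) != len(nodes):
        raise ValueError("links contain a cycle")
    return order

# example: inputs 1 and 2 feed node 4, which feeds node 5
links = [(1, 4, 2.0), (2, 4, 3.5), (4, 5, 1.0)]
order = creation_order(links)  # 1 and 2 come before 4, 4 before 5
```

Building the TF operations in this order removes the forward-reference problem entirely, but only for acyclic graphs; the `ValueError` branch fires exactly in the cyclic case described below.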

Then I tried to skip nodes that are connected to ones that don't exist yet:

deletion = []
while len(self.connections) > 0:
    for neuron_id in deletion:
        self.connections.pop(neuron_id, None)
    deletion = []
    # create input & output vertices
    for neuron_id in self.connections:
        # same logic as above, with the addition of:
        deletion.append(neuron_id)

And this works, but when the graph contains a cycle the loop never terminates.

The only idea I have to solve this problem is to do it in two passes: in the first pass create all nodes in the graph, and in the second replace them with the actual values. I thought about using placeholders, but I'm not sure how to implement that.
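The bookkeeping behind the two-pass idea can be sketched without TensorFlow: pass one creates an empty record for every neuron id mentioned anywhere, pass two wires the inputs by reference, so forward references (and even cycles) can never raise a KeyError. This is only an illustration of the approach, not TF code; the `Node` class and the synchronous-update rule are made up for the sketch:

```python
class Node:
    def __init__(self, neuron_id):
        self.id = neuron_id
        self.inputs = []   # list of (Node, weight) pairs, filled in pass 2
        self.value = 0.0   # last computed activation

def build(links, input_values):
    # pass 1: create a node for every neuron id mentioned anywhere
    nodes = {}
    for src, dst, _w in links:
        for nid in (src, dst):
            nodes.setdefault(nid, Node(nid))

    # pass 2: wire the connections; every referenced node already exists
    for src, dst, w in links:
        nodes[dst].inputs.append((nodes[src], w))

    for nid, val in input_values.items():
        nodes[nid].value = val
    return nodes

def step(nodes):
    # one synchronous update: each non-input neuron sums its weighted
    # inputs, reading the values from the previous step (cycles are fine)
    new_values = {nid: sum(src.value * w for src, w in node.inputs)
                  for nid, node in nodes.items() if node.inputs}
    for nid, v in new_values.items():
        nodes[nid].value = v

links = [(1, 4, 2.0), (2, 4, 3.5), (4, 5, 1.0)]
nodes = build(links, {1: 1.0, 2: 2.0})
step(nodes)
print(nodes[4].value)  # 1.0*2.0 + 2.0*3.5 = 9.0
```

In TF 1.x terms, pass one would correspond to creating a placeholder or variable per neuron and pass two to building the multiply/sum ops against them, but the mapping is an assumption on my part, not tested TF code.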

So any help is welcome.


1 Answer


Building a graph with a cycle is not (yet?) possible in TensorFlow, because computing the gradient becomes too difficult. The usual approach is to circumvent the problem by "unfolding" the graph a bit, as explained in the recurrent neural network tutorial. In most deep learning tasks this performs very well. There is another answer explaining this as well (again in the context of RNNs).

If you want a "pure" cyclic graph, maybe PyTorch can help you.

EDIT (02/2020):

With TF 2.0 and Keras it's easier to build recurrent networks, which are basically cyclic graphs. However, I believe that behind the scenes it's still an unfolded graph, not a truly cyclic one.
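The unfolding idea can be illustrated without any framework: a cyclic graph is evaluated by repeating the same synchronous update for a fixed number of steps, so the unrolled computation is acyclic even though the wiring is not. A toy sketch in plain Python (the self-loop example and the `forward` helper are made up for illustration):

```python
def forward(links, values, steps):
    """Evaluate a (possibly cyclic) weighted graph by unrolling:
    repeat the same synchronous update `steps` times.
    links: list of (src, dst, weight); values: {node_id: initial value}."""
    targets = {dst for _src, dst, _w in links}
    for _ in range(steps):
        values = {
            nid: (sum(values[src] * w
                      for src, dst, w in links if dst == nid)
                  if nid in targets else val)   # source-only nodes keep value
            for nid, val in values.items()
        }
    return values

# node 1 feeds node 2; node 2 also feeds back into itself with weight 0.5
links = [(1, 2, 1.0), (2, 2, 0.5)]
vals = forward(links, {1: 1.0, 2: 0.0}, steps=3)
print(vals[2])  # 1.0 -> 1.5 -> 1.75 over the three unrolled steps
```

Each pass through the loop is one "time step" of the unrolled graph, which is also how RNN cells in Keras are executed over a sequence.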

gdelab
  • But I found some answers, e.g. [this](https://stackoverflow.com/questions/37551389/cyclic-computational-graphs-with-tensorflow-or-theano), which say that cyclic graphs are supported. I don't need backpropagation; I only want to compute the forward pass. The network structure and weights are built with [NEAT](https://en.wikipedia.org/wiki/Neuroevolution_of_augmenting_topologies). A few people have also recommended PyTorch, so I will definitely have a look. – 5ar Feb 21 '18 at 15:50