
I have a trained convolutional neural network A that outputs the probability that a given picture contains a square or a circle.

Another network B takes images of random noise. My idea is to have a stack of convolutional layers so that the output is a newly generated square. As an error function I would like to feed the generated image into A and learn the filters of B from the softmax tensor of A. To my understanding this is a sort of generative adversarial network, except that A does not learn. While trying to implement this I have encountered two problems.

  1. I have imported the layers of A that I want to use in B as follows:

    with gfile.FastGFile("shape-classifier.pb", 'rb') as f:
        graph_def = tf.GraphDef()
        graph_def.ParseFromString(f.read())
        image_input_layer, extern_softmax_tensor = tf.import_graph_def(
            graph_def, name="", return_elements=["image_input", "Softmax"])
    

    I would like to avoid calling sess.run() three times (generating the random image, getting the softmax values from A, adjusting the weights of B). Is there a way to directly connect the tensors so that I only have one graph?

    Calling:

    logits = extern_softmax_tensor(my_generated_image_tensor)
    

    throws:

    TypeError: 'Operation' object is not callable
    

    The "graph-connected" and the "feed-connected" approaches confuse me a bit.

  2. logits = extern_softmax_tensor(my_generated_image_tensor) # however you would call it
    cross_entropy = tf.nn.softmax_cross_entropy_with_logits(labels=label_input,
                                                            logits=logits)
    cross_entropy_mean = tf.reduce_mean(cross_entropy)
    optimizer = tf.train.AdamOptimizer(learning_rate=0.01)
    learning_step = optimizer.minimize(cross_entropy_mean)
    

    With that logic the error will first be passed back through A. Is there a way to use the softmax calculated by A to directly adjust the layers of B?
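
My current understanding of the TypeError in problem 1 (please correct me if I am wrong): requesting "Softmax" from tf.import_graph_def returns the tf.Operation of that name, whereas "Softmax:0" would return that operation's output tf.Tensor, and neither one is callable like a Python function. A minimal sketch of the distinction, using a tiny stand-in graph rather than my actual classifier (written against tf.compat.v1 so it runs on current TensorFlow):

```python
import tensorflow as tf

tf1 = tf.compat.v1

g = tf1.Graph()
with g.as_default():
    # Stand-in for the classifier's input and softmax output.
    x = tf1.placeholder(tf.float32, shape=[None, 2], name="image_input")
    tf.nn.softmax(x, name="Softmax")

softmax_op = g.get_operation_by_name("Softmax")  # a tf.Operation
softmax_t = g.get_tensor_by_name("Softmax:0")    # its first output, a tf.Tensor

# The tensor is the op's first output; neither object is callable.
assert softmax_t is softmax_op.outputs[0]
assert not callable(softmax_op) and not callable(softmax_t)
```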

Leaving aside whether my idea actually works: is it possible to build this in TensorFlow? I hope I have made my problems clear.

Thank you very much

fst

1 Answer


Yes, it is possible to connect two models like that.

# Generator network
my_generated_image_tensor = ...
# Read classifier
with gfile.FastGFile("shape-classifier.pb", 'rb') as f:
    graph_def = tf.GraphDef()
    graph_def.ParseFromString(f.read())
    # Map generator output to classifier input
    extern_softmax_tensor, = tf.import_graph_def(
        graph_def, name="", input_map={"image_input": my_generated_image_tensor},
        return_elements=["Softmax:0"])
# Define loss and training step
cross_entropy = tf.nn.softmax_cross_entropy_with_logits(
    labels=label_input, logits=extern_softmax_tensor)
cross_entropy_mean = tf.reduce_mean(cross_entropy)
optimizer = tf.train.AdamOptimizer(learning_rate=0.01)
learning_step = optimizer.minimize(cross_entropy_mean)
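
Regarding your second question: the gradient does flow back through A's operations (that is unavoidable, it is just the chain rule), but you can restrict which variables actually get updated with the var_list argument of minimize(); a graph imported from a frozen GraphDef contains no trainable variables anyway, so only B would be trained. A toy sketch of that mechanism, with stand-in ops instead of your real models (the names and constants here are illustrative):

```python
import tensorflow as tf

tf1 = tf.compat.v1

g = tf1.Graph()
with g.as_default():
    # Stand-in for generator B: a single trainable variable.
    w = tf1.get_variable("b_weight", initializer=0.0)
    generated = w * tf.ones([1])
    # Stand-in for frozen A: fixed ops with no variables of their own.
    loss = tf.reduce_mean(tf.square(generated * 3.0 - 6.0))
    optimizer = tf1.train.AdamOptimizer(learning_rate=0.1)
    # var_list restricts the update to B's variables; the gradient is still
    # computed back through A's ops.
    learning_step = optimizer.minimize(loss, var_list=[w])
    init = tf1.global_variables_initializer()

with tf1.Session(graph=g) as sess:
    sess.run(init)
    first = sess.run(loss)
    for _ in range(200):
        sess.run(learning_step)
    final = sess.run(loss)  # loss shrinks as only w is adjusted
```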

As side notes:

  1. tf.nn.softmax_cross_entropy_with_logits expects logits as input, i.e. the values before they are passed through the softmax function. So if the Softmax tensor in the loaded model is the output of a softmax operation, you should probably change the code to use its input logits instead.
  2. tf.nn.softmax_cross_entropy_with_logits is now deprecated anyway; see tf.nn.softmax_cross_entropy_with_logits_v2.
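
To illustrate side note 1 end to end, here is a self-contained sketch that builds a tiny stand-in classifier instead of loading shape-classifier.pb, and pulls out a hypothetical pre-softmax tensor named "logits:0" (your real model may name it differently); it is written against tf.compat.v1 so it runs on current TensorFlow:

```python
import tensorflow as tf

tf1 = tf.compat.v1

# Build a stand-in "classifier" graph: image_input -> logits -> Softmax.
classifier_graph = tf1.Graph()
with classifier_graph.as_default():
    inp = tf1.placeholder(tf.float32, shape=[None, 2], name="image_input")
    logits = tf.identity(inp * 2.0, name="logits")
    tf.nn.softmax(logits, name="Softmax")
graph_def = classifier_graph.as_graph_def()

# In a fresh graph, wire a "generator" output straight into the classifier.
main_graph = tf1.Graph()
with main_graph.as_default():
    my_generated_image_tensor = tf1.placeholder(tf.float32, shape=[None, 2])
    # Ask for the pre-softmax "logits:0" tensor rather than "Softmax:0".
    extern_logits_tensor, = tf1.import_graph_def(
        graph_def, name="",
        input_map={"image_input:0": my_generated_image_tensor},
        return_elements=["logits:0"])
    label_input = tf1.placeholder(tf.float32, shape=[None, 2])
    cross_entropy = tf.nn.softmax_cross_entropy_with_logits(
        labels=label_input, logits=extern_logits_tensor)

with tf1.Session(graph=main_graph) as sess:
    loss = sess.run(cross_entropy, feed_dict={
        my_generated_image_tensor: [[1.0, 0.0]],
        label_input: [[1.0, 0.0]]})
    # loss[0] == -log(softmax([2, 0])[0]) ≈ 0.1269
```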

jdehesa