
In my TensorFlow model, the output of one network is a tensor. I need to feed this value as input to another, pretrained network. I'm loading the pretrained network as follows:

input_b_ph = tf.placeholder(shape=(), dtype=tf.float32, name='input_b_ph')
sess1 = tf.Session()
saver = tf.train.import_meta_graph(model_path.as_posix() + '.meta', input_map={'input/Identity:0': input_b_ph})
graph = tf.get_default_graph()
saver.restore(sess1, model_path.as_posix())
output_b = graph.get_tensor_by_name('output/Identity:0')

I need to feed a tensor (not a numpy value) to this input placeholder. How can I achieve this?

Edit 1: Adding end-to-end details:
I have a network A defined in TensorFlow which takes input input_a and produces output output_a. I need to feed this to a pretrained ResNet50 model. For this I used the ResNet50 from tf.keras:

from tensorflow.keras.applications.resnet50 import ResNet50
from tensorflow.keras.applications.resnet50 import preprocess_input

resnet_model = ResNet50(include_top=False, pooling='avg')
preprocessed_input = preprocess_input(tf.cast(output_a, tf.float32))
output_resnet = resnet_model([preprocessed_input])

The output of ResNet is output_resnet. I need to feed this to another pretrained network, say network B. B is actually written in Keras; I modified it to use tf.keras. I then save the trained model as below:

import tensorflow as tf
from tensorflow import keras
curr_sess = keras.backend.get_session()
with tf.name_scope('input'):
    _ = tf.identity(quality_net.model.input)    # creates node 'input/Identity'
with tf.name_scope('output'):
    __ = tf.identity(quality_net.model.output)  # creates node 'output/Identity'
saver = tf.train.Saver()
saver.save(curr_sess, output_filepath.as_posix())

I have access to this network B and tried to save the model in h5 format, but it gave an error that the model is thread-locked. On searching the internet, I learnt that this error comes up when there are Lambda layers in the network. So I resorted to saving the model in TensorFlow format - 3 files: meta, weights and index. (Any solution using the h5 format is also acceptable.)

There is a caveat here: the structure of network B can keep changing, and it comes from a different project, so I can't hardcode the architecture of B; I have to load it from the saved model. My problem is how to restore this pretrained model and pass output_resnet as its input. The output of network B, i.e. output_b, is the loss used to train my original network A. Currently I'm able to restore network B as follows:

input_b_ph = tf.placeholder(shape=(), dtype=tf.float32, name='input_b_ph')
sess1 = tf.Session()
saver = tf.train.import_meta_graph(model_path.as_posix() + '.meta', input_map={'input/Identity:0': input_b_ph})
graph = tf.get_default_graph()
saver.restore(sess1, model_path.as_posix())
output_b = graph.get_tensor_by_name('output/Identity:0')

I have the output from ResNet as output_resnet, which is a tensor. I need a way to set this as the input input_b_ph. How can I achieve that? Any alternative solutions are also acceptable.

Nagabhushan S N
  • I suppose this has to do with [your previous question](https://stackoverflow.com/q/59084901/1782792), but I'm not sure I understand the difference, or just what exactly you need to achieve in this case. You want to be able to do something like `session.run(out, feed_dict={feature_input: x})`, where `x` is a `tf.Tensor`? That is not possible (see e.g. [here](https://stackoverflow.com/q/42560209/1782792)), if you want to connect the output of one model to the input of the next, `feature_input` should be the `tf.Tensor` produced by the first model instead of a placeholder. – jdehesa Nov 28 '19 at 15:31
  • Yes, this is in continuation to previous question. Yes I know feeding tensor to placeholder is not possible. I've updated my question with my problem in detail. Please take a look at that and let me know if there is any way possible. Thanks. Ask me if you need any further clarifications. – Nagabhushan S N Nov 28 '19 at 16:53
  • Thanks for the additional detail. I'm sorry, though, I'm still not sure I understand completely. Can not you simply pass `input_map={'input/Identity:0': output_resnet}`? – jdehesa Nov 28 '19 at 17:23
  • Will that work? Shouldn't a placeholder be passed to input_map? I'll try that (tomorrow). Thanks – Nagabhushan S N Nov 28 '19 at 17:24
  • Yes, any tensor value should be okay, see if that works. – jdehesa Nov 28 '19 at 17:32
  • Yes. It did work. Thanks a ton! Would you mind writing it as answer. I'll accept it – Nagabhushan S N Nov 29 '19 at 14:20

1 Answer


Mentioning the answer in this answer section (although it is present in the comments section), for the benefit of the community.

A placeholder is not required in this case. import_meta_graph accepts any tensor in input_map, so passing output_resnet directly resolves the issue.

Replacing the code,

saver = tf.train.import_meta_graph(model_path.as_posix() + '.meta', 
                                        input_map={'input/Identity:0': input_b_ph})

with

saver = tf.train.import_meta_graph(model_path.as_posix() + '.meta', 
                                      input_map={'input/Identity:0': output_resnet})

has resolved the issue.
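To see the whole technique end to end, here is a minimal, self-contained sketch. The toy "network B" (a single matmul), the temporary checkpoint path, and the constant tensor standing in for output_resnet are all hypothetical stand-ins for the question's real models; the code uses tf.compat.v1 so it also runs under TensorFlow 2 (under TF 1.x, plain tf exposes the same API).

```python
import os
import tempfile

import tensorflow as tf

tf1 = tf.compat.v1  # under TF 1.x, use `tf` directly

ckpt = os.path.join(tempfile.mkdtemp(), 'network_b')

# --- Stand-in for pretrained "network B" (hypothetical toy model). ---
# In the real setting this checkpoint already exists on disk; this part
# exists only so the example is runnable. Note the identity nodes under
# the 'input'/'output' name scopes, mirroring the question's save code.
g_b = tf1.Graph()
with g_b.as_default():
    with tf1.name_scope('input'):
        inp = tf.identity(tf1.placeholder(tf.float32, shape=(None, 4)))
    w = tf1.get_variable('w', initializer=tf.ones([4, 3]))
    with tf1.name_scope('output'):
        out = tf.identity(tf.matmul(inp, w))
    with tf1.Session(graph=g_b) as sess:
        sess.run(tf1.global_variables_initializer())
        tf1.train.Saver().save(sess, ckpt)  # writes .meta/.index/.data

# --- Main graph: splice a tensor into network B via input_map. ---
g = tf1.Graph()
with g.as_default():
    # Stands in for output_resnet; any tensor of compatible shape works.
    output_resnet = tf.ones([2, 4])
    saver = tf1.train.import_meta_graph(
        ckpt + '.meta',
        input_map={'input/Identity:0': output_resnet})
    output_b = g.get_tensor_by_name('output/Identity:0')
    with tf1.Session(graph=g) as sess:
        saver.restore(sess, ckpt)    # restores network B's weights
        result = sess.run(output_b)  # no feed_dict needed for the input

print(result.shape)
```

Because input_map rewires every consumer of 'input/Identity:0' to read from output_resnet, output_b is now an ordinary tensor downstream of network A, so gradients can flow through B back into A during training.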