
I have trained an object detection model with a Faster R-CNN network and have the frozen_inference_graph.pb and label_map.pbtxt after training. I want to deploy it as a REST API server so that it can be called from systems that do not have TensorFlow installed. That's when I came across TFX.

How can I use TFX's tensorflow-model-server to load this model and host a REST API, so that I can send images for prediction as a POST request?

https://www.tensorflow.org/tfx/tutorials/serving/rest_simple is what I found as a reference, but the models there are in a different format than what I currently have. Is there any mechanism by which I can reuse the model I currently have, or will I have to retrain using Keras and deploy as shown in the reference?
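
For reference, the client-side call I'm aiming for is a plain HTTP POST against TF Serving's REST predict endpoint, roughly like this (a sketch only; the model name "detector", the port 8501, and the file sample.jpg are placeholders):

import json
import numpy as np
import requests
from PIL import Image

# TF Serving's REST API expects a JSON body with an "instances" list;
# an object detection model takes a uint8 image batch of shape [1, H, W, 3].
image = np.array(Image.open('sample.jpg').convert('RGB'))
payload = json.dumps({"instances": [image.tolist()]})

# Standard predict endpoint: /v1/models/<model_name>:predict
response = requests.post('http://localhost:8501/v1/models/detector:predict',
                         data=payload)
print(response.json())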

Sreekiran A R
  • Can you provide the structure of your .pb file, or a link where it can be accessed, so we can inspect the possibility of reusing it? – TF_Support Apr 15 '20 at 08:50
  • Sharing a Drive link which contains the files: https://drive.google.com/open?id=1xGvgYln0mZondOMXgSdiM3nddozwMWOQ – Sreekiran A R Apr 15 '20 at 09:14
  • Hi @Sreekiran, Can you provide a sample image of your training set? – TF_Support Apr 16 '20 at 10:05
  • Hi, please find the link for a sample image. https://drive.google.com/file/d/14ODsJqu5S7OB0Paw4Nz8FBjNE0WSbZzp/view?usp=sharing – Sreekiran A R Apr 17 '20 at 08:47
  • Hi @TF_Support, have you found anything that could help? – Sreekiran A R Apr 20 '20 at 10:18
  • Hi @Sreekiran, the problem is that the model you have doesn't have the serving signature that TFX needs. I tried converting your model to add a signature, but it doesn't return any predictions. – TF_Support Apr 20 '20 at 10:23

1 Answer


To reuse your model with TFX, the frozen graph needs to have a serving signature specified. I tried converting your model into the SavedModel format using the code below, which successfully created a saved_model.pb file with the tag-set "serve".

import tensorflow as tf
from tensorflow.python.saved_model import signature_constants
from tensorflow.python.saved_model import tag_constants

export_dir = './saved'
graph_pb = 'frozen_inference_graph.pb'

builder = tf.saved_model.builder.SavedModelBuilder(export_dir)

# Load the frozen graph definition from disk.
with tf.gfile.GFile(graph_pb, "rb") as f:
    graph_def = tf.GraphDef()
    graph_def.ParseFromString(f.read())

sigs = {}

with tf.Session(graph=tf.Graph()) as sess:
    # name="" is important to ensure we don't get spurious prefixing
    tf.import_graph_def(graph_def, name="")
    g = tf.get_default_graph()
    # Standard input/output tensor names of an Object Detection API model.
    inp = g.get_tensor_by_name("image_tensor:0")
    outputs = {}
    outputs["detection_boxes"] = g.get_tensor_by_name('detection_boxes:0')
    outputs["detection_scores"] = g.get_tensor_by_name('detection_scores:0')
    outputs["detection_classes"] = g.get_tensor_by_name('detection_classes:0')
    outputs["num_detections"] = g.get_tensor_by_name('num_detections:0')

    # Map the input and output tensors to serving signatures; the default
    # signature is the one tensorflow-model-server uses when none is named.
    sigs[signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY] = \
        tf.saved_model.signature_def_utils.predict_signature_def(
            {"in": inp}, outputs)

    sigs["predict_images"] = \
        tf.saved_model.signature_def_utils.predict_signature_def(
            {"in": inp}, outputs)

    builder.add_meta_graph_and_variables(sess,
                                         [tag_constants.SERVING],
                                         signature_def_map=sigs)

builder.save()
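
To double-check the export before serving, the tag-sets and signatures can be listed from Python (a sketch, assuming a TF 1.x installation and the './saved' export directory used above; the same check can be done with saved_model_cli):

from tensorflow.python.tools import saved_model_utils

# Fetch the meta graph stored under the "serve" tag-set and list its signatures.
meta_graph = saved_model_utils.get_meta_graph_def('./saved', 'serve')
print(list(meta_graph.signature_def.keys()))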

We tested the converted model by running a prediction on the sample image you provided. The result doesn't contain any detections, which suggests this conversion method doesn't work as expected.
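
For reference, a test along these lines reproduces the behaviour (a minimal sketch, assuming the './saved' export directory from above and a local copy of your sample image saved as 'sample.jpg'):

import numpy as np
import tensorflow as tf
from PIL import Image
from tensorflow.python.saved_model import tag_constants

with tf.Session(graph=tf.Graph()) as sess:
    # Load the converted SavedModel under its "serve" tag-set.
    tf.saved_model.loader.load(sess, [tag_constants.SERVING], './saved')
    g = tf.get_default_graph()

    # Object detection models expect a uint8 image batch of shape [1, H, W, 3].
    image = np.expand_dims(np.array(Image.open('sample.jpg').convert('RGB')), 0)

    scores = sess.run(g.get_tensor_by_name('detection_scores:0'),
                      feed_dict={g.get_tensor_by_name('image_tensor:0'): image})
    print(scores)  # near-zero scores across the board means no detections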

As for your question:

"Is there any mechanism in which I can reuse the model I currently have or will I have to retrain using Keras and deploy as shown in the reference?"

Given this result, the answer to your question is that it is better to just retrain your model using Keras; converting or reusing your frozen graph isn't going to be the solution. A frozen graph does not keep the variables in the form required for serving, and the format is not suitable for a production environment. And yes, it is best to follow the official documentation, as you can be assured that it works.
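
In the linked tutorial, the retrained Keras model is exported as a versioned SavedModel that tensorflow-model-server picks up directly; the export step boils down to something like this (a sketch, assuming TF 2.x; the model and paths are placeholders, not your detector):

import os
import tensorflow as tf

# Placeholder Keras model standing in for whatever you retrain.
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(8, 3, activation='relu', input_shape=(224, 224, 3)),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(4),
])

# TF Serving expects a numeric version subdirectory under the model base path.
export_path = os.path.join('./models/detector', '1')
tf.keras.models.save_model(model, export_path)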

TF_Support