
I've been following the TensorFlow for Poets 2 codelab on a model I've trained, and have created a frozen, quantized graph with embedded weights. It's captured in a single file - say my_quant_graph.pb.

Since I can use that graph for inference with the TensorFlow Android inference library just fine, I thought I could do the same with Cloud ML Engine, but it seems it only works on a SavedModel model.

How can I simply convert a frozen/quantized graph in a single pb file to use on ML engine?

Mark McDonald

2 Answers


It turns out that a SavedModel provides some extra info around a saved graph. Assuming the frozen graph doesn't need assets, it only needs a serving signature to be specified.

Here's the Python code I ran to convert my graph to a format that Cloud ML Engine accepted. Note that I have only a single pair of input/output tensors.

import tensorflow as tf
from tensorflow.python.saved_model import signature_constants
from tensorflow.python.saved_model import tag_constants

export_dir = './saved'
graph_pb = 'my_quant_graph.pb'

builder = tf.saved_model.builder.SavedModelBuilder(export_dir)

with tf.gfile.GFile(graph_pb, "rb") as f:
    graph_def = tf.GraphDef()
    graph_def.ParseFromString(f.read())

sigs = {}

with tf.Session(graph=tf.Graph()) as sess:
    # name="" is important to ensure we don't get spurious prefixing
    tf.import_graph_def(graph_def, name="")
    g = tf.get_default_graph()
    inp = g.get_tensor_by_name("real_A_and_B_images:0")
    out = g.get_tensor_by_name("generator/Tanh:0")

    sigs[signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY] = \
        tf.saved_model.signature_def_utils.predict_signature_def(
            {"in": inp}, {"out": out})

    builder.add_meta_graph_and_variables(sess,
                                         [tag_constants.SERVING],
                                         signature_def_map=sigs)

builder.save()
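To confirm the export worked, `saved_model_cli show --dir ./saved --all` prints the full serving signature. As a lighter-weight sanity check, this sketch (a hypothetical helper, not part of TensorFlow) just verifies the directory layout a SavedModel loader expects:

```python
import os

def looks_like_saved_model(export_dir):
    """Hypothetical helper (not part of TensorFlow): check that the
    directory contains the saved_model.pb / saved_model.pbtxt graph
    file a SavedModel loader looks for. The variables/ folder may be
    missing or empty when all weights are frozen into the graph."""
    return any(os.path.isfile(os.path.join(export_dir, name))
               for name in ("saved_model.pb", "saved_model.pbtxt"))
```

Because the weights here are frozen into the graph itself, an empty `variables/` folder is expected and not a sign of a broken export.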
Mark McDonald
  • I'm trying to do this, but someone gave me the checkpoint directory without the code. It seems like I need the names of the input and output nodes. Is there a way to get the input and output nodes from the info in the checkpoint directory? – blueether Jul 14 '17 at 07:19
  • Yep, use the inspect checkpoint tool: https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/tools/inspect_checkpoint.py – Mark McDonald Jul 14 '17 at 07:21
  • Thanks for the quick reply. When I ran it I got: `python inspect_checkpoint.py --file_name=checkpoint 2017-07-14 07:38:02.585722: W tensorflow/core/util/tensor_slice_reader.cc:95] Could not open ./checkpoint: Data loss: not an sstable (bad magic number): perhaps your file is in a different file format and you need to use a different restore operator? Unable to open table file ./checkpoint: Data loss: not an sstable (bad magic number): perhaps your file is in a different file format and you need to use a different restore operator?` – blueether Jul 14 '17 at 07:39
  • I tried out your code, but the variables folder is empty. I'm using TensorFlow Hub to retrain an image classifier following [this](https://www.tensorflow.org/hub/tutorials/image_retraining). Is the variables folder supposed to be empty (in some cases)? – user 007 Feb 28 '19 at 16:24
  • @blueether - If the checkpoint tool isn't working, you can try loading the model in TensorBoard and inspecting it visually. Alternatively the checkpoint should have a .pbtxt file that contains the description of the model graph, you can either inspect it by hand or use tensorboard's graph viz element. I did the latter in [this repo](https://github.com/googlecodelabs/tensorflow-style-transfer-android/tree/gh-pages), you'll just need to replace the existing pbtxt file with yours. – Mark McDonald Mar 15 '19 at 02:10
  • @user007 I'm not sure how TF Hub fits in here, try asking a brand new question here to get some more visibility. Make sure you tag it with `tensorflow` and `tensorflow-hub`. – Mark McDonald Mar 15 '19 at 02:12
  • What if I wanted to also export the variables? Currently, this method does not export any variables. – S. P May 29 '20 at 09:55
  • @MarkMcDonald what if I have multiple input and output nodes? – Divyang Vashi Sep 08 '20 at 06:29

Here is a sample with multiple output nodes:

# Convert a ProtoBuf model to SavedModel, the format for TF Serving
# https://cloud.google.com/ai-platform/prediction/docs/exporting-savedmodel-for-prediction
import shutil
import tensorflow.compat.v1 as tf
from tensorflow.python.saved_model import signature_constants
from tensorflow.python.saved_model import tag_constants

export_dir = './1' # TF Serving can run different versions of the same model, so the current model goes in the '1' folder.
graph_pb = 'frozen_inference_graph.pb'

# Clear out folder
shutil.rmtree(export_dir, ignore_errors=True)

builder = tf.saved_model.builder.SavedModelBuilder(export_dir)

with tf.io.gfile.GFile(graph_pb, "rb") as f:
    graph_def = tf.GraphDef()
    graph_def.ParseFromString(f.read())

sigs = {}

with tf.Session(graph=tf.Graph()) as sess:
    # Prepare input and outputs of model
    tf.import_graph_def(graph_def, name="")
    g = tf.get_default_graph()
    image_tensor = g.get_tensor_by_name("image_tensor:0")
    num_detections = g.get_tensor_by_name("num_detections:0")
    detection_scores = g.get_tensor_by_name("detection_scores:0")
    detection_boxes = g.get_tensor_by_name("detection_boxes:0")
    detection_classes = g.get_tensor_by_name("detection_classes:0")

    sigs[signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY] = \
        tf.saved_model.signature_def_utils.predict_signature_def(
            {"input_image": image_tensor}, 
            {   "num_detections": num_detections,
                "detection_scores": detection_scores, 
                "detection_boxes": detection_boxes, 
                "detection_classes": detection_classes})

    builder.add_meta_graph_and_variables(sess,
                                         [tag_constants.SERVING],
                                         signature_def_map=sigs)

builder.save()
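Once a model like this is deployed, the online-prediction request body must key its payload by the signature's input name ("input_image" here). A minimal sketch of assembling that JSON with only the standard library; the 2x2 "image" values are made up for illustration:

```python
import json

def build_request(image):
    # AI Platform / TF Serving expect {"instances": [...]}, one entry
    # per example; the key must match the signature's input name.
    return json.dumps({"instances": [{"input_image": image}]})

image = [[[0, 0, 0], [255, 255, 255]],
         [[128, 128, 128], [64, 64, 64]]]  # 2x2 RGB, nested lists
body = build_request(image)
print(json.loads(body)["instances"][0]["input_image"][0][1])  # [255, 255, 255]
```

The prediction response then carries all four signature outputs (`num_detections`, `detection_scores`, `detection_boxes`, `detection_classes`) for each instance.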
Jarikus
  • My frozen TF1 graph (that I want to convert to saved model) has the following tensors: ghostbin.com/MlDHG - what params do I need to specify using your script? – lepton Sep 17 '21 at 11:09