
I tried deploying the TensorFlow for Poets model to Cloud ML, but the export is not suited for online prediction. Does anybody have an idea how I can alter the script for deployment and online prediction: https://github.com/tensorflow/hub/blob/master/examples/image_retraining/retrain.py

I found this script online (from rhaertel80), but it is based on an older retrain script and I don't think it works: Deploying and predicting the tensorflow for poets on google-cloud-ml

import tensorflow as tf
# tf.saved_model.simple_save is part of the public TF 1.x API,
# so no imports from tensorflow.python.saved_model are needed.


export_dir = 'my_model2'
retrained_graph = 'retrained_graph.pb'
label_count = 5

class Model(object):
    def __init__(self, label_count):
        self.label_count = label_count

    def build_prediction_graph(self, g, image_bytes, keys_placeholder):
        # Map the serving inputs/outputs onto tensors in the imported graph.
        inputs = {
            'key': keys_placeholder,
            'image_bytes': image_bytes
        }

        keys = tf.identity(keys_placeholder)
        outputs = {
            'key': keys,
            'prediction': g.get_tensor_by_name('final_result:0')
        }

        return inputs, outputs

    def export(self, output_dir):
        with tf.Session(graph=tf.Graph()) as sess:
            # This will be our input that accepts a batch of JPEG-encoded strings.
            image_bytes = tf.placeholder(tf.string, name='input', shape=(None,))
            # Force it to be a single input; will raise an error if we send a batch.
            coerced = tf.squeeze(image_bytes)
            # When we import the graph, connect `coerced` to 'DecodeJpeg/contents:0'.
            input_map = {'DecodeJpeg/contents:0': coerced}

            with tf.gfile.GFile(retrained_graph, "rb") as f:
                graph_def = tf.GraphDef()
                graph_def.ParseFromString(f.read())
                tf.import_graph_def(graph_def, input_map=input_map, name="")

            # Pass-through key so predictions can be matched back to inputs.
            keys_placeholder = tf.placeholder(tf.string, shape=[None])

            inputs, outputs = self.build_prediction_graph(
                sess.graph, image_bytes, keys_placeholder)

            tf.saved_model.simple_save(sess, output_dir, inputs, outputs)

model = Model(label_count)
model.export(export_dir)
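For testing, a request instance for `gcloud ml-engine local predict` (as suggested in the comments) or the online prediction API could be built like this. This is a minimal sketch assuming the `key`/`image_bytes` signature exported above; `make_instance` and the file paths are hypothetical names, not part of the original script. Cloud ML expects binary data base64-encoded under a `b64` key, with the input alias ending in `_bytes`:

```python
import base64
import json

def make_instance(image_path, key='0'):
    # Hypothetical helper: Cloud ML's JSON format requires binary data to be
    # wrapped as {'b64': <base64 string>}, and the corresponding input name
    # in the serving signature must end in '_bytes'.
    with open(image_path, 'rb') as f:
        encoded = base64.b64encode(f.read()).decode('utf-8')
    return {'key': key, 'image_bytes': {'b64': encoded}}

# `gcloud ml-engine local predict` expects one JSON instance per line:
# with open('request.json', 'w') as f:
#     f.write(json.dumps(make_instance('test.jpg')) + '\n')
```

You could then test locally with something like `gcloud ml-engine local predict --model-dir=my_model2 --json-instances=request.json` before deploying. Sending the raw (non-b64-wrapped) string is what typically triggers "Expected float32, got ... of type 'unicode'"-style input errors.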
  • Can you provide details on the failures you are experiencing? It also usually helps to try with `gcloud ml-engine local predict` – rhaertel80 Jun 13 '18 at 16:24
  • I tried to use the model exported with the export_model function, which states that it is the model for serving. When I use that model for online predictions in Cloud ML, I get: "error": "Prediction failed: Error processing input: Expected float32, got "..." of type 'unicode' instead." Do I need to add preprocessing for the image data to the graph? I don't know how to add it to the graph when I export it. I used the tf.hub code (line 947): https://github.com/tensorflow/hub/blob/master/examples/image_retraining/retrain.py (also I read that you mentioned somewhere that this code is not suited for Cloud ML?) – Christiaan Jun 13 '18 at 22:32
  • How are you sending your input? (Raw Tensor Encoded as JSON, Tensors Packed as Byte Strings, Compressed Image Data). Do the input placeholders have 'None' as the outer-dimension of their shape? (e.g., `input_images = tf.placeholder(dtype=tf.float32, shape=[None,320,240,3], name='source')` ). – Héctor Neri Oct 17 '18 at 17:03

0 Answers