1

I'm a beginner with TensorFlow and I have some doubts about how to deploy my code on Google Cloud.

I ran the retrain.py script on my local machine and retrained Inception with my own pictures. I have the graph (.pb and .meta). After that I developed another Python script to test my model, something like this:

import sys

import tensorflow as tf

# change this as you see fit
image_path = sys.argv[1]

# Read in the image_data
image_data = tf.gfile.FastGFile(image_path, 'rb').read()

# Loads label file, strips off carriage return
label_lines = [line.rstrip() for line 
                   in tf.gfile.GFile("mylabel_path\\labels.txt")]

# Unpersists graph from file
with tf.gfile.FastGFile("mygraph_path\\graph.pb", 'rb') as f:
    graph_def = tf.GraphDef()
    graph_def.ParseFromString(f.read())
    _ = tf.import_graph_def(graph_def, name='')

with tf.Session() as sess:
    # Feed the image_data as input to the graph and get first prediction
    softmax_tensor = sess.graph.get_tensor_by_name('final_result:0')

    predictions = sess.run(softmax_tensor,
                           {'DecodeJpeg/contents:0': image_data})

    # Sort to show labels of first prediction in order of confidence
    top_k = predictions[0].argsort()[-len(predictions[0]):][::-1]

    for node_id in top_k:
        human_string = label_lines[node_id]
        score = predictions[0][node_id]
        print('%s (score = %.5f)' % (human_string, score))

On my local machine I execute the script with one parameter: the path of the picture I want to predict. The script works fine and I get the response in the final print. But I don't know how to run this script on Google Cloud as a service. I have already looked through some forums and I can't find a solution. For now, I have turned this code into a web service and deployed it on my own server, but I need to use Google Cloud.

Any suggestions?

RMH

2 Answers

2

By Google Cloud, do you mean Cloud Machine Learning? This platform enables you to build machine learning models and is optimized for TensorFlow.

To deploy your model on this platform you need to save your graph in two files named export and export.meta (see the linked documentation for more details).
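
A rough sketch of producing those two files with a `tf.train.Saver` (not the asker's code; the variable below is just a stand-in, and the export has to happen from a session that still holds variables, since the frozen graph.pb written by retrain.py has them folded into constants):

import tensorflow as tf

# Hypothetical variable standing in for the retrained layer's weights.
final_weights = tf.Variable(tf.zeros([2048, 5]), name='final_weights')

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # The V1 checkpoint format writes exactly two files, `export` and
    # `export.meta`; the newer V2 format splits the variables across
    # several `export.*` files instead.
    saver = tf.train.Saver(write_version=tf.train.SaverDef.V1)
    saver.save(sess, 'export')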

In my view the only problem is how to pass the image you want to predict to the service, because Cloud ML doesn't accept JPEG files directly; you have to convert your image to a text format.
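
A minimal sketch of that conversion (the input name image_bytes, the file name, and the gcloud command are assumptions; they depend on how your graph was exported and on your gcloud version):

import base64
import json

# Read the JPEG and base64-encode it so it can travel inside a JSON request.
with open('my_picture.jpg', 'rb') as f:
    encoded = base64.b64encode(f.read()).decode('utf-8')

# One JSON object per line, which is the layout that
# `gcloud ml-engine predict --json-instances` expects; double-check the
# flag names against the docs for your gcloud release.
with open('request.json', 'w') as out:
    out.write(json.dumps({'image_bytes': {'b64': encoded}}) + '\n')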

I think that you could find useful suggestions here and here.

Davide Biraghi
1

The TensorFlow project provides Serving to turn models into services. The only protocol supported at this time is protocol buffers over gRPC.

The TensorFlow website and the Serving page provide sample code and tutorials.
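
For a single image, the gRPC call from Python roughly follows the Serving Inception client example of that era; the host, port, model name, signature name and input key below are placeholders that depend on how the server was started and how the model was exported:

from grpc.beta import implementations
import tensorflow as tf

from tensorflow_serving.apis import predict_pb2
from tensorflow_serving.apis import prediction_service_pb2

# Connect to a running TensorFlow Serving instance (host/port are placeholders).
channel = implementations.insecure_channel('localhost', 9000)
stub = prediction_service_pb2.beta_create_PredictionService_stub(channel)

# Build a Predict request that carries the raw JPEG bytes.
request = predict_pb2.PredictRequest()
request.model_spec.name = 'inception'                 # served model name
request.model_spec.signature_name = 'predict_images'  # as in the Inception example
with open('my_picture.jpg', 'rb') as f:
    request.inputs['images'].CopyFrom(
        tf.contrib.util.make_tensor_proto(f.read(), shape=[1]))

result = stub.Predict(request, 10.0)  # 10-second timeout
print(result)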

Eric Platon
  • Thanks Eric. I already tried those tutorials, but I don't know how to consume the service by sending one single picture to predict (the base64 or the picture URL). I'm trying a little bit more and I'll keep you informed! – RMH Feb 22 '17 at 12:54
  • The client code is a script called `tensorflow_serving/examples/inception_model.py`. You need to build it with Bazel, though, to use it. It accepts a server host/port and an image file name as parameters. Everything is explained in the tutorials. – Eric Platon Feb 26 '17 at 14:46