  • I trained an image classifier following this [tutorial from TensorFlow](https://www.tensorflow.org/hub/tutorials/image_retraining).

  • I used this snippet to generate my SavedModel after the training process.

  • I followed the instructions from Google to deploy my model and I tried to make some predictions with an image from my local directory.

  • To perform the prediction I used this:

# Create request message in JSON format
python -c 'import base64, json; img = base64.b64encode(open("image.jpg", "rb").read()); print json.dumps({"image_bytes": {"b64": img}})' > request.json

# Call prediction service API to get classifications
gcloud ml-engine predict --model ${MODEL_NAME} --json-instances request.json
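
For reference, the one-liner above is Python 2; a Python 3 equivalent (same image.jpg and request.json names) would be:

import base64
import json

# Read the image as raw bytes and base64-encode it for the request body.
with open('image.jpg', 'rb') as f:
    img = base64.b64encode(f.read()).decode('utf-8')

# Write one instance in the {"image_bytes": {"b64": ...}} form.
with open('request.json', 'w') as out:
    json.dump({'image_bytes': {'b64': img}}, out)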

And I got the following error:

"error": "Prediction failed: Error processing input: Expected float32, got {u'b64': u'/9j/4AA....lPqevnQf//Z'} of type 'dict' instead.

Should I retrain the model using a different input type, or how can I solve this problem? Any tips are much appreciated.

user 007
  • nit: The command has changed to gcloud ai-platform. – gogasca Dec 09 '19 at 19:22
  • Does this answer your question? [GCP ML Engine Prediction failed: Error processing input: Expected float32 got base64](https://stackoverflow.com/questions/54242029/gcp-ml-engine-prediction-failed-error-processing-input-expected-float32-got-ba) – gogasca Dec 10 '19 at 22:36

2 Answers


You need to make sure your serving function is written as below. Note that the name of the input is image_bytes; it can be anything, as long as it ends in _bytes.

    def serving_input_fn():
        feature_placeholders = {
            'image_bytes': tf.placeholder(dtype=tf.string, shape=[None], name='source')
        }
        # The decoded tensor is what the model graph consumes; the placeholder
        # is what the client sends.
        single_image = tf.decode_raw(feature_placeholders['image_bytes'], tf.float32)
        return tf.estimator.export.ServingInputReceiver({'image': single_image}, feature_placeholders)
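
As a rough sketch of how the decoding could be wired in (assuming, per the signature shown in the other answer, that the model's input tensor is named image and takes 299x299x3 floats; adjust to your graph), the serving function can decode the JPEG bytes itself:

    import tensorflow as tf

    HEIGHT, WIDTH, CHANNELS = 299, 299, 3  # assumed input size

    def serving_input_fn():
        feature_placeholders = {
            'image_bytes': tf.placeholder(dtype=tf.string, shape=[None], name='source')
        }

        def decode_and_resize(image_bytes):
            # Decode JPEG bytes, convert to floats in [0, 1], and resize.
            image = tf.image.decode_jpeg(image_bytes, channels=CHANNELS)
            image = tf.image.convert_image_dtype(image, tf.float32)
            return tf.image.resize_images(image, [HEIGHT, WIDTH])

        images = tf.map_fn(decode_and_resize,
                           feature_placeholders['image_bytes'],
                           dtype=tf.float32)
        return tf.estimator.export.ServingInputReceiver(
            {'image': images}, feature_placeholders)

This way the {"b64": ...} payload is decoded inside the graph instead of reaching the model as a dict.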

To learn more about how to send the data and the rationale behind it, please check out https://stackoverflow.com/a/49177909/6031363

Additionally, you can visit the AI Platform docs for instructions on sending prediction requests: https://cloud.google.com/ml-engine/docs/prediction-overview

Puneith Kaul

I ran the tutorial. The Python code which you use:

python -c 'import base64, json; img = base64.b64encode(open("image.jpg", "rb").read()); print json.dumps({"image_bytes": {"b64": img}})' > request.json

Generates a file with the following contents:

{"image_bytes": {"b64": "Base64Text..."}}

Train your model using the --saved_model_dir option to export a SavedModel:

$ python retrain.py --image_dir ~/flower_photos --saved_model_dir=/tmp/saved_models/$(date +%s)

Use the SavedModel CLI to inspect your SavedModel. The following command shows the input/output signatures of a TensorFlow SavedModel:

$ saved_model_cli show --dir /tmp/saved_models/1575937119 --all
MetaGraphDef with tag-set: 'serve' contains the following SignatureDefs:
signature_def['serving_default']:
  The given SavedModel SignatureDef contains the following input(s):
    inputs['image'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 299, 299, 3)
        name: Placeholder:0
  The given SavedModel SignatureDef contains the following output(s):
    outputs['prediction'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 5)
        name: final_result:0
  Method name is: tensorflow/serving/predict

This means that the model expects its input as a float32 tensor, not a b64-encoded image.
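
You can also inspect the signature from Python rather than the CLI; a minimal TF 1.x sketch, reusing the export path from the command above:

import tensorflow as tf

EXPORT_DIR = '/tmp/saved_models/1575937119'  # path from the example above

with tf.Session(graph=tf.Graph()) as sess:
    # Load the MetaGraphDef tagged 'serve' and read its default signature.
    meta_graph = tf.saved_model.loader.load(sess, ['serve'], EXPORT_DIR)
    signature = meta_graph.signature_def['serving_default']
    print(signature.inputs)   # shows the float32 'image' input, no string tensor
    print(signature.outputs)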

You can start TensorFlow Serving to test locally:

tensorflow_model_server --model_base_path=/tmp/saved_models/ --rest_api_port=9001

URL=http://localhost:9001/v1/models/default:predict
curl -X POST -d @out.json $URL

where out.json is a request file in JSON format. If the request is well-formed, you will get the expected result. You can use the following code to generate the file:

import numpy as np
import json
from PIL import Image

INPUT_FILE = 'image.jpg'
OUTPUT_FILE = '/tmp/out.json'

def convert_to_json(image_file):
    """Open image, convert it to numpy and create JSON request"""
    # Resize to the model's expected input size (299x299 per the signature above).
    img = Image.open(image_file).resize((299, 299))
    img_array = np.array(img)
    predict_request = {"instances": [img_array.tolist()]}
    with open(OUTPUT_FILE, 'w') as output_file:
        json.dump(predict_request, output_file)
    return predict_request

prediction_data = convert_to_json(INPUT_FILE)

You will get:

{ 
 "predictions": [[0.0, 0.0, 1.0, 0.0, 0.0]] 
}
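
If you prefer to send the request from Python instead of curl, a minimal sketch using the requests library (same endpoint and out.json as above):

import json
import requests

URL = 'http://localhost:9001/v1/models/default:predict'

# Load the {"instances": [...]} body generated by convert_to_json above.
with open('/tmp/out.json') as f:
    payload = json.load(f)

response = requests.post(URL, json=payload)
print(response.json())  # e.g. {"predictions": [[0.0, 0.0, 1.0, 0.0, 0.0]]}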

If you use AI Platform, you can send the request using gcloud ai-platform predict, or use the AI Platform UI for testing.
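
Programmatically, the same online prediction can be sent with the Google API Python client; a sketch, where the project and model names are placeholders:

import json
from googleapiclient import discovery

PROJECT = 'my-project'  # placeholder: your GCP project
MODEL = 'flowers'       # placeholder: your deployed model name

# Build a client for the AI Platform (ml/v1) API.
service = discovery.build('ml', 'v1')
name = 'projects/{}/models/{}'.format(PROJECT, MODEL)

# Reuse the {"instances": [...]} request generated by convert_to_json above.
with open('/tmp/out.json') as f:
    body = json.load(f)

response = service.projects().predict(name=name, body=body).execute()
print(response)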


Check "How convert a jpeg image into json file in Google machine learning" for details.

As mentioned by @Puneith, you need to change the Serving function to handle b64.

This question is similar to GCP ML Engine Prediction failed: Error processing input: Expected float32 got base64

gogasca