I have built and trained a TensorFlow model, which is deployed using the tf.Estimator paradigm. I have built a serving function like the one below:

def serving_input_fn(params):
    feature_placeholders = {
        'inputs' : tf.placeholder(tf.int64, [None], name='inputs')
    }
    features = {
        key: tensor
        for key, tensor in feature_placeholders.items()
    }
    return tf.estimator.export.ServingInputReceiver(features, feature_placeholders) 

Now, I want to be able to call it using application/json as the content type. So I built a JSON payload like the example I found in this question:

payload = {'instances': [{'inputs': [1039]}]}
json_string = json.dumps(payload)

When I invoke the model I get back:

ERROR in serving: Unsupported request data format: {u'instances': [{u'inputs': [1039]}]}.
Valid formats: tensor_pb2.TensorProto, dict<string, tensor_pb2.TensorProto> and predict_pb2.PredictRequest

Any ideas how I can achieve my goal?

Dimitris Poulopoulos

1 Answer

As it turns out, the serving endpoint expects the JSON form of a tensor_pb2.TensorProto (one of the formats listed in the error message), so the request should be:

request = {'dtype': 'DT_INT64', 
           'tensorShape': {'dim':[{'size': 1}]},
           'int64Val': [1039]}

json_string = json.dumps(request)
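For reuse, the construction can be wrapped in a small helper. This is just a sketch; the function name is my own, and only the payload shape (dtype, tensorShape, int64Val) comes from the answer above:

```python
import json

def make_int64_tensor_request(values):
    # Hypothetical helper: encode a flat list of ints as the JSON form
    # of tensor_pb2.TensorProto, matching the shape shown above.
    return json.dumps({
        'dtype': 'DT_INT64',
        'tensorShape': {'dim': [{'size': len(values)}]},
        'int64Val': values,
    })

json_string = make_int64_tensor_request([1039])
parsed = json.loads(json_string)
```

This also keeps the declared tensor shape in sync with the number of values, which is easy to get wrong when building the dict by hand.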
Dimitris Poulopoulos