
What I did is, I deployed TensorFlow Serving using Docker on Windows, and I am running the Inception model inside it. It is up and running. Now, using Java, I want to upload an image from the browser to this Inception model running in TensorFlow Serving, and in response I should get the class name.

Any example would help.

AsthaUndefined
Nikhil

2 Answers


Assuming TensorFlow Serving 1.11.0-rc1 (it does not work in 1.10.1), the request JSON payload has the following form:

{
  "inputs": {
    "images": [{ "b64": "IMAGE_BASE64" }]
  }
}

Where IMAGE_BASE64 is the Base64 encoding of the image to predict on (typically a long string, so not shown here). The Java client needs to build this payload for an input image.
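
For illustration, a minimal sketch of building that payload in plain Java (standard library only; the class and method names are just an example):

import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Base64;

public class PayloadBuilder {

    // Reads an image file, Base64-encodes it, and wraps it in the payload shown above.
    static String buildPayload(String imagePath) throws Exception {
        byte[] imageBytes = Files.readAllBytes(Paths.get(imagePath));
        String b64 = Base64.getEncoder().encodeToString(imageBytes);
        return "{ \"inputs\": { \"images\": [ { \"b64\": \"" + b64 + "\" } ] } }";
    }
}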

Then, the Java client would submit the request to the following endpoint:

POST /v1/models/inception:predict

You may have to replace inception with the name of the model deployed on your server.
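
A minimal sketch of submitting that request from Java, assuming Java 11+ (for java.net.http.HttpClient) and the localhost:8501 mapping used below; adapt the URL and model name to your setup:

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class PredictClient {

    // POSTs the JSON payload to the Serving REST endpoint and returns the raw JSON response.
    static String predict(String payload) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:8501/v1/models/inception:predict"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(payload))
                .build();
        HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
        return response.body();
    }
}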


To try this from a shell (either from a Linux shell running in Docker, or with an equivalent PowerShell command) against the server on localhost (mapped to the port exposed by your Docker Serving container, here 8501):

curl -d "@request.json" -X POST http://localhost:8501/v1/models/inception:predict

The request.json file contains the JSON payload shown at the beginning of this answer. A typical response:

{
  "outputs": {
    "classes": [
      "class1",
      "class2",
      ...
    ],
    "scores": [
      [
        0.0321035,
        0.905796,
        ...
      ]
    ]
  }
}
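
To pull the top class name out of that response, one option is a small helper like the following (a sketch assuming the Jackson library is on the classpath and the response shape shown above):

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

public class ResponseParser {

    // Returns the class name with the highest score, given a response shaped like the one above.
    static String topClass(String responseJson) throws Exception {
        JsonNode outputs = new ObjectMapper().readTree(responseJson).get("outputs");
        JsonNode classes = outputs.get("classes");
        JsonNode scores = outputs.get("scores").get(0);
        int best = 0;
        for (int i = 1; i < scores.size(); i++) {
            if (scores.get(i).asDouble() > scores.get(best).asDouble()) {
                best = i;
            }
        }
        return classes.get(best).asText();
    }
}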

Important note: The above runs come from a deployment of an old Inception v3 model (2015), exported at the time of Serving 0.4.1. It still works very well with Serving 1.11.0-rc1. If you have exported your model with the latest export script, there might be small differences (although the 1.11.0-rc1 export script does not seem to differ in terms of signature). So this answer may not work as is for you, but it should put you on the way to solving the problem.

Eric Platon
  • This is exactly what I have been looking for. I am using C# for Base64 image string generation. Unfortunately, the server gets disconnected on POST with the message "evhttp_read_cb: illegal connection state 7". When I look at the PIL format in Keras, its formatting is really different compared to the B64 in C#. Any help? – PCG Feb 14 '19 at 06:29

TensorFlow Serving is a service, so treat it as such; there is no need for anything special. Since 1.8, TensorFlow Serving offers a REST API, so simply send an HTTP request from your Java application to the TensorFlow Serving REST service.

There is a simple example of how to set up a TensorFlow Serving REST service for MNIST (but it can be used with any model) in a blog post I recently posted: "Tensorflow serving: REST vs gRPC" https://medium.com/@avidaneran/tensorflow-serving-rest-vs-grpc-e8cef9d4ff62

All that is left for you to do is create the REST request according to your model's (Inception) signature, along the lines of the sketch below.
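
For example, a minimal sketch of such a request using only the standard library's HttpURLConnection (the payload below is just a placeholder; fill in a body matching your model's signature, and adapt the host, port, and model name to your deployment):

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Scanner;

public class RestRequestExample {

    public static void main(String[] args) throws Exception {
        // Placeholder body; build the real payload according to your model's signature.
        String json = "{ \"inputs\": { \"images\": [ { \"b64\": \"...\" } ] } }";

        HttpURLConnection conn =
                (HttpURLConnection) new URL("http://localhost:8501/v1/models/inception:predict").openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "application/json");
        conn.setDoOutput(true);
        try (OutputStream os = conn.getOutputStream()) {
            os.write(json.getBytes(StandardCharsets.UTF_8));
        }

        // Print the raw JSON response from the model server.
        try (Scanner scanner = new Scanner(conn.getInputStream(), StandardCharsets.UTF_8.name())) {
            System.out.println(scanner.useDelimiter("\\A").next());
        }
    }
}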

z-star
  • The person who asked the question might not be familiar with REST APIs. We should provide an answer that actually answers the question; it is not good to just point to an external doc. – amir Oct 31 '22 at 10:34