I want to make online object detection predictions (inference) against an existing model on Google Cloud ML Engine, but I am not able to build the JSON request.
The model is faster_rcnn_inception_resnet_v2_atrous_coco_2017_11_08 from the TF model zoo. The inputs are images; the outputs are classes, bounding boxes, scores, etc.
The required input is (from saved_model_cli show):
inputs['inputs'] tensor_info:
    dtype: DT_UINT8
    shape: (-1, -1, -1, 3)
    name: image_tensor:0
Since the model expects a uint8 array, I load the image into a NumPy array:
encoded_contents = np.array(image.getdata()).reshape(
    (im_height, im_width, 3)).astype(np.uint8)
Then I add a batch dimension:

image_np_expanded = np.expand_dims(encoded_contents, axis=0)
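For reference, here is a minimal, self-contained sketch of the preprocessing above; it assumes the image is opened with PIL, and the file path is a placeholder:

import numpy as np
from PIL import Image

image = Image.open("example.jpg").convert("RGB")  # placeholder path
im_width, im_height = image.size

# H x W x 3 uint8 array, matching the DT_UINT8, (-1, -1, -1, 3) signature
encoded_contents = np.array(image.getdata()).reshape(
    (im_height, im_width, 3)).astype(np.uint8)

# Add the batch dimension -> shape (1, im_height, im_width, 3)
image_np_expanded = np.expand_dims(encoded_contents, axis=0)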
I tried to build the JSON request:

import json

instance = {"input": encoded_contents}
row = json.dumps(instance, sort_keys=True)
But I am not able to build it, because a NumPy array is not JSON serializable:

TypeError(repr(o) + " is not JSON serializable")
TypeError: array([[[164, 191, 220],
        [190, 157, 114],
        [190, 157, 114]]], dtype=uint8) is not JSON serializable
If I convert the NumPy array to a list with the tolist() method, the JSON file is about 3 MB and ML Engine rejects it with: "message": "Request payload size exceeds the limit: 1572864 bytes.",
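This is roughly what that attempt looks like (a sketch, mirroring the instance snippet above):

instance = {"input": encoded_contents.tolist()}  # nested Python lists are JSON serializable
row = json.dumps(instance, sort_keys=True)
# len(row) is roughly 3 MB, which exceeds the 1572864-byte online prediction limit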
I send this JSON to ml-engine predict as a JSON file:
gcloud ml-engine predict --model=pellaires --version=pellaires14 --json-instances=request.json > response.yaml
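For completeness, request.json is written with one JSON instance per line, which is what --json-instances expects (a sketch using the row built above):

with open("request.json", "w") as f:
    f.write(row + "\n")  # one instance per line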