I have a Keras graph with a float32 tensor of shape (?, 224, 224, 3) that I want to export to TensorFlow Serving in order to make predictions through its RESTful API. The problem is that I cannot send raw tensors as input, only base64-encoded strings, since that is a limitation of the REST API. That means that when exporting the graph, the input needs to be a string that gets decoded into the image tensor.
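For context, the request I want to send to the REST endpoint would look roughly like this (just a sketch; the host, port, model name, and file name are placeholders, and the {"b64": ...} wrapping is the format the Serving REST API uses for binary inputs):

import base64
import json
import requests

# hypothetical endpoint; "my_model" is a placeholder model name
url = 'http://localhost:8501/v1/models/my_model:predict'
with open('example.jpg', 'rb') as f:
    jpeg_bytes = f.read()
# the input key ('image_bytes') has to match the key in the exported signature
payload = {'instances': [{'image_bytes': {'b64': base64.b64encode(jpeg_bytes).decode('utf-8')}}]}
response = requests.post(url, data=json.dumps(payload))
print(response.json())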
How can I "inject" this new string input so that it is converted into the old image tensor, without retraining the graph itself? I have tried several examples [1][2].
I currently have the following code for exporting:
import tensorflow as tf
from tensorflow.python.saved_model.signature_def_utils_impl import predict_signature_def

# `image` is the new string input; model.output is still wired to the old float32 input
image = tf.placeholder(dtype=tf.string, shape=[None], name='source')
signature = predict_signature_def(inputs={'image_bytes': image},
                                  outputs={'output': model.output})
I somehow need to find a way to feed image (after decoding) into model.input, or a way to connect the model's output to a graph that starts from image.
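To make it concrete, this is roughly the wiring I imagine, building on the snippet above, though I don't know whether calling the Keras model on the decoded tensor like this is the right approach (a sketch only; it assumes JPEG inputs resized to 224x224, and that the REST layer has already base64-decoded the {"b64": ...} value before it reaches the placeholder):

# decode each JPEG byte string into a float32 image of shape (224, 224, 3)
def _decode(jpeg_bytes):
    img = tf.image.decode_jpeg(jpeg_bytes, channels=3)
    img = tf.image.resize_images(img, [224, 224])
    return tf.cast(img, tf.float32)

decoded = tf.map_fn(_decode, image, dtype=tf.float32)  # shape (?, 224, 224, 3)
# is calling the model on the new tensor the correct way to reuse the trained graph?
output = model(decoded)
signature = predict_signature_def(inputs={'image_bytes': image},
                                  outputs={'output': output})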
Any help would be greatly appreciated!