
I tried to follow this tutorial on converting a Keras H5 model to Protobuf and serving it using TensorFlow Serving: https://towardsdatascience.com/deploying-keras-models-using-tensorflow-serving-and-flask-508ba00f1037

That tutorial, among many other resources on the web, uses `tf.saved_model.simple_save`, which is deprecated and has been removed by now (March 2019). Converting the h5 into a pb using `freeze_session` as shown in How to export Keras .h5 to tensorflow .pb? seems to be missing a "serve" tag, as `tensorflow_model_server` outputs:

Loading servable: {name: ImageClassifier version: 1} failed: Not found: Could not find meta graph def matching supplied tags: { serve }. To inspect available tag-sets in the SavedModel, please use the SavedModel CLI: saved_model_cli

I checked it with `saved_model_cli`; there are no tags.

What is the way to make an h5 model servable with `tensorflow_model_server` nowadays?

Jens Caasen

1 Answer


NOTE: This applies to TF 2.0+

I'm assuming you have your Keras model in model.h5. First, load the model with TensorFlow's implementation of Keras:

from tensorflow import keras
model = keras.models.load_model('model.h5')

Then, simply export a SavedModel:

keras.experimental.export_saved_model(model, 'path_to_saved_model')

Finally, apply any transformation you normally do to go from a SavedModel to the .pb inference file (e.g., freezing, optimizing for inference, etc.).

You can find more details and a full example in TF's official guide for saving and serializing models in TF 2.0.

GPhilo
  • Thank you for your reply. Do I have to have a specific version of Keras? I installed Keras using "pip3 install keras". When using your code, it exits with the error "AttributeError: module 'tensorflow.python.keras' has no attribute 'experimental'" – Jens Caasen Mar 22 '19 at 16:57
  • That's what I meant by "tensorflow's implementation of Keras". Don't `import keras`; the `from tensorflow import keras` part is very important – GPhilo Mar 22 '19 at 17:10
  • I did not import anything else, just what you wrote. But now I noticed: on my local machine it cannot find "experimental", and in a Google Colab VM it does find "experimental", but not "export_saved_model". So I guess I have a version problem somewhere. Thank you for your input so far; I will try to solve that first – Jens Caasen Mar 22 '19 at 17:50
  • What TF version do you have installed? – GPhilo Mar 22 '19 at 17:55
  • Locally 1.5.0, and Google Colab even runs 1.13.0. So TF 2.0 was just recently released and I need to update more often. I will figure out how to use TF2 with my CPU (I had to downgrade the TF version once because my CPU does not support an instruction set newer TF versions are compiled with) and then try again. Thanks so far! – Jens Caasen Mar 23 '19 at 08:29
  • Yes, TF 1.5 is definitely not supporting this. Version 1.13 might, under different names. I do refer to the alpha version of 2.0 because you asked how this is done "in 2019". TF2 will be out in a matter of months and that will be the "new" way to do this. – GPhilo Mar 23 '19 at 08:41
  • Why does 1.3 support it and 1.5 not? Isn't 1.5 newer than 1.3? I will try to find documentation on 1.3 and keras.experimental then, to find out the right name for the "save" method – Jens Caasen Mar 23 '19 at 15:05
  • It's 1.13 (thirteen), not 1.3 (Amended my previous incorrect comment that wrongly stated 1.13 would be the last TF1 release before TF2. The last v1 stable release is TF 1.15) – GPhilo Oct 21 '19 at 14:45
  • API is updated to `model.save('path_to_saved_model', save_format="tf")` now – arao6 Dec 26 '19 at 08:05