
The motivation behind this question: I had saved a Keras model built with Matterport's Mask R-CNN, and in tf.keras.callbacks.ModelCheckpoint() I had very explicitly set the save_weights_only argument to False so that the entire model would be saved, not just the weights.

Turns out there's a bug in the ModelCheckpoint() callback where it sometimes does not save the full model.

This is obviously a problem when you go to load the model after closing your TF session, as the Graph, architecture, and optimizer state are gone, making it hard (if not impossible) to reload that saved model.

Therefore, I am asking whether it is possible to somehow extract the TF session retroactively, from just the .h5 weights file, after the session has closed (resulting from, for example, your Notebook kernel crashing).

Not much code to go on, but here it is:

Given a .h5 file that was saved after each epoch of training a model in Keras, is it possible to extract the Graph session from that .h5 file, and if so, how?

I have several models saved in .h5 format, but I never called tf.get_session() while saving the model weights, nor did I wrap anything in a

with tf.Session() as sess:

block. How can I load such a model using TensorFlow?
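One workaround when only the weights survived: if the code that defines the architecture is still available (for Matterport's Mask R-CNN it is, since the model is built in code), you can rebuild the model and load the weights into it. A minimal self-contained sketch with a stand-in architecture and a hypothetical model.weights.h5 path; the sketch saves and reloads a tiny model just to show the pattern:

```python
import tensorflow as tf

# Stand-in for your real architecture; with MaskRCNN you would instead
# re-instantiate the model from the same config used during training.
def build_model():
    inputs = tf.keras.Input(shape=(4,))
    x = tf.keras.layers.Dense(16, activation='relu')(inputs)
    outputs = tf.keras.layers.Dense(1)(x)
    return tf.keras.Model(inputs, outputs)

# Training run: the callback effectively saved only the weights.
trained = build_model()
trained.save_weights('model.weights.h5')

# Later session (e.g. after a kernel crash): rebuild the architecture
# in code, then load the saved weights into the fresh model.
model = build_model()
model.load_weights('model.weights.h5')
```

The graph and optimizer state are still lost, but the rebuilt model carries the trained weights and can be used for inference or export.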

TF 2.0 makes this a cinch, but how can it be solved on TensorFlow 1.14?
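For reference, the TF 2.x version really is short, assuming the .h5 actually contains the full model (architecture plus weights). The tiny model and the paths below are stand-ins:

```python
import tensorflow as tf  # assumes TF 2.x

# Stand-in for a full model saved to .h5 (architecture + weights).
inputs = tf.keras.Input(shape=(4,))
outputs = tf.keras.layers.Dense(1)(inputs)
model = tf.keras.Model(inputs, outputs)
model.save('model.h5')

# The TF 2.x round trip: load the .h5, then export a SavedModel
# directory containing the saved_model.pb that TF Serving consumes.
restored = tf.keras.models.load_model('model.h5')
tf.saved_model.save(restored, 'exported/1')
```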

The end goal is to take a model saved with Keras as a .h5 file and run inference on it with TensorFlow Serving, which, to my knowledge, requires a SavedModel containing a protobuf (.pb) file.

https://medium.com/@pipidog/how-to-convert-your-keras-models-to-tensorflow-e471400b886a

I've tried keras_to_tensorflow: https://github.com/amir-abdi/keras_to_tensorflow


1 Answer


The code below converts a full-model ModelCheckpoint file saved in .h5 format to .pb format on TF 1.x:

import tensorflow as tf

# The export path contains the name and the version of the model
tf.keras.backend.set_learning_phase(0) # Ignore dropout at inference
model = tf.keras.models.load_model('./model.h5')
export_path = './PlanetModel/1'

# Fetch the Keras session and save the model
# The signature definition is defined by the input and output tensors
# And stored with the default serving key
with tf.keras.backend.get_session() as sess:
    tf.saved_model.simple_save(
        sess,
        export_path,
        inputs={'input_image': model.input},
        outputs={t.name:t for t in model.outputs})
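Before pointing TF Serving at the export, it can help to confirm the directory really carries a serving signature. A self-contained sketch using a stand-in graph and a hypothetical check_export path, written against tf.compat.v1 so it runs on both 1.14 and 2.x:

```python
import tensorflow as tf

tf1 = tf.compat.v1

# Build and export a tiny stand-in graph the same way the answer does;
# in the real workflow, the Keras session and model tensors go here.
graph = tf.Graph()
with graph.as_default():
    x = tf1.placeholder(tf.float32, shape=[None, 4], name='input_image')
    w = tf1.get_variable('w', shape=[4, 1])
    y = tf.matmul(x, w, name='output')
    with tf1.Session(graph=graph) as sess:
        sess.run(tf1.global_variables_initializer())
        tf1.saved_model.simple_save(
            sess, 'check_export/1',
            inputs={'input_image': x},
            outputs={'output': y})

# Reload the SavedModel and check that simple_save stored the signature
# under the default serving key.
with tf1.Session(graph=tf.Graph()) as sess:
    meta = tf1.saved_model.loader.load(sess, ['serve'], 'check_export/1')
    print(list(meta.signature_def))
```

The 'serve' tag is what simple_save attaches by default, and it is also the tag TF Serving looks for when loading the model.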

For more information, please refer to this article.

For other ways to do it, please refer to this Stack Overflow answer.