If you built your model with an Estimator, you can use tf.estimator.Estimator.export_saved_model to freeze and export it as a SavedModel:
import tensorflow as tf

# Build the Estimator from your own model_fn; model_saved_dir is the directory
# holding your checkpoints.
model = tf.estimator.Estimator(
    model_fn=model_fn,
    model_dir=model_saved_dir)

def serving_input_receiver_fn():
    # The serving input here is a batch of 512 x 512 single-channel images.
    feature = tf.compat.v1.placeholder(tf.float32, shape=[None, 512, 512, 1], name="inputs")
    return tf.estimator.export.TensorServingInputReceiver(feature, feature)

model.export_saved_model(model_saved_dir, serving_input_receiver_fn)
This code works in TensorFlow 2.0.
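To sanity-check the export, the SavedModel can be loaded back and called directly. The snippet below is only a rough sketch: export_path is a hypothetical example of the timestamped sub-directory that export_saved_model creates under model_saved_dir, and the input key is read from the signature rather than assumed.

import numpy as np
import tensorflow as tf

# export_path is a hypothetical example; export_saved_model creates a
# timestamped sub-directory under model_saved_dir and returns its path.
export_path = "model_saved_dir/1577836800"

loaded = tf.saved_model.load(export_path)
infer = loaded.signatures["serving_default"]

# The input key depends on how the serving input receiver named the tensor,
# so read it from the signature instead of hard-coding it.
input_key = list(infer.structured_input_signature[1].keys())[0]

# Run the model on a dummy batch of 512 x 512 single-channel images.
dummy = tf.constant(np.zeros((1, 512, 512, 1), dtype=np.float32))
print(infer(**{input_key: dummy}))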
Or, if you use Keras, you can follow the steps in the official guide:
https://www.tensorflow.org/tutorials/keras/save_and_load#savedmodel_format
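For reference, the Keras route in that tutorial boils down to model.save and tf.keras.models.load_model. The sketch below uses a throwaway architecture purely as a stand-in for your own model; the save path is arbitrary.

import tensorflow as tf

# Stand-in architecture; substitute your own Keras model. The point is only
# the save/load calls, which follow the linked tutorial.
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(8, 3, activation="relu", input_shape=(512, 512, 1)),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")

model.save("saved_model/my_model")  # SavedModel format (a directory, not a single file)
restored = tf.keras.models.load_model("saved_model/my_model")
restored.summary()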