
I have a frozen inference graph (frozen_inference_graph.pb) and a checkpoint (model.ckpt.data-00000-of-00001, model.ckpt.index). How do I deploy these to TensorFlow Serving? Serving needs the SavedModel format; how do I convert to it? I am studying TensorFlow and found that DeepLab v3+ provides a PASCAL VOC 2012 model. I can run training, evaluation, and visualization on my local PC, but I don't know how to deploy the model to Serving.

Amir
candrwow

1 Answer


Have you tried export_inference_graph.py?

It prepares an object detection TensorFlow graph for inference using a model configuration and a trained checkpoint, and outputs the inference graph, associated checkpoint files, a frozen inference graph, and a SavedModel.
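The script is a single invocation from the TensorFlow models repository; a sketch with placeholder paths (the pipeline config and checkpoint prefix below are assumptions, substitute your own):

```shell
# Run from the tensorflow/models/research directory; all paths are placeholders.
python object_detection/export_inference_graph.py \
  --input_type image_tensor \
  --pipeline_config_path path/to/pipeline.config \
  --trained_checkpoint_prefix path/to/model.ckpt \
  --output_directory path/to/exported_model
# Among other artifacts, the output directory contains
# saved_model/saved_model.pb, which TensorFlow Serving can load directly.
```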

ma.mehralian
  • The SavedModel was obtained by following the solution at this link: https://stackoverflow.com/questions/44329185/convert-a-graph-proto-pb-pbtxt-to-a-savedmodel-for-use-in-tensorflow-serving-o – candrwow Jan 03 '19 at 11:31
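The approach in that linked answer can be sketched as follows: load the frozen GraphDef, import it into a fresh graph, and re-export it with `SavedModelBuilder`. This is a minimal sketch; the input/output tensor names are assumptions (for DeepLab they are typically `ImageTensor:0` and `SemanticPredictions:0`), so inspect your own graph's node names before using it.

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

def frozen_graph_to_saved_model(frozen_graph_path, export_dir,
                                input_tensor_name, output_tensor_name):
    """Wrap a frozen GraphDef in a SavedModel that TF Serving can load.

    Tensor names are assumptions -- check your graph's actual node names
    (e.g. "ImageTensor:0" / "SemanticPredictions:0" for DeepLab v3+).
    """
    # Parse the frozen graph from disk.
    graph_def = tf.GraphDef()
    with tf.gfile.GFile(frozen_graph_path, "rb") as f:
        graph_def.ParseFromString(f.read())

    # Import it into a fresh graph and look up the I/O tensors.
    with tf.Graph().as_default() as graph:
        tf.import_graph_def(graph_def, name="")
        inp = graph.get_tensor_by_name(input_tensor_name)
        out = graph.get_tensor_by_name(output_tensor_name)

        # Build a SavedModel with a predict signature under the SERVING tag.
        builder = tf.saved_model.builder.SavedModelBuilder(export_dir)
        signature = tf.saved_model.signature_def_utils.predict_signature_def(
            inputs={"input": inp}, outputs={"output": out})
        key = tf.saved_model.signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY
        with tf.Session(graph=graph) as sess:
            builder.add_meta_graph_and_variables(
                sess,
                [tf.saved_model.tag_constants.SERVING],
                signature_def_map={key: signature})
        builder.save()
```

The exported directory can then be served with, e.g., `tensorflow_model_server --model_base_path=...` (the directory must sit inside a numbered version folder such as `export_dir/1/`).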