
I want to save a model in Python using the SavedModelBuilder API and then load it in C++.

I was able to successfully build the TensorFlow label_image example and load the downloaded model in C++. However, when I try to do the same with my own custom model, it does not work. I then used Tom's solution to export my .pb file. I tried to load that model both as shown in the label_image example and as a MetaGraphDef, but neither worked.

My code to load the GraphDef is

tensorflow::GraphDef graph_def;
Status load_graph_status =
    ReadBinaryProto(tensorflow::Env::Default(), graph_file_name, &graph_def);
session->reset(tensorflow::NewSession(tensorflow::SessionOptions()));
Status session_create_status = (*session)->Create(graph_def);

and to load the MetaGraphDef is

tensorflow::MetaGraphDef graph_def;
Status load_graph_status =
    ReadBinaryProto(tensorflow::Env::Default(), graph_file_name, &graph_def);
session->reset(tensorflow::NewSession(tensorflow::SessionOptions()));
Status session_create_status = (*session)->Create(graph_def.graph_def());

I saved the model in Python as per Tom's article:

prediction_signature = tf.saved_model.signature_def_utils.build_signature_def(
    inputs={"inputs": model_input},
    outputs={"outputs": model_output},
    method_name=tf.saved_model.signature_constants.PREDICT_METHOD_NAME)
builder.add_meta_graph_and_variables(
    sess,
    [tf.saved_model.tag_constants.SERVING],
    signature_def_map={
        tf.saved_model.signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY:
            prediction_signature
    })
builder.save()

When I run the code, I just get the error

 Failed to load compute graph

when I load the model as a plain GraphDef, whereas if I load it as a MetaGraphDef, I get the following error:

[libprotobuf ERROR external/protobuf_archive/src/google/protobuf/wire_format_lite.cc:611] String field 'tensorflow.NodeDef.op' contains invalid UTF-8 data when parsing a protocol buffer. Use the 'bytes' type if you intend to send raw bytes.

Note that I am able to run the label_image example, so there are no other issues such as an incorrect path to the model or general syntax problems.
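My current guess (I may be wrong) is that the saved_model.pb written by builder.save() is neither a bare GraphDef nor a MetaGraphDef, but a SavedModel protocol buffer that wraps the MetaGraphDef, which would explain the errors above. If that is right, a parse along the lines of the sketch below (the saved_model.pb.h include path is my assumption) should at least read the file, although it would only give me the graph structure and not the trained variables:

#include "tensorflow/core/protobuf/saved_model.pb.h"

tensorflow::SavedModel saved_model;
// Parse saved_model.pb as a SavedModel proto instead of a GraphDef/MetaGraphDef.
Status load_status = ReadBinaryProto(tensorflow::Env::Default(),
                                     graph_file_name, &saved_model);
// The MetaGraphDef tagged "serve" is one of the wrapped graphs.
const tensorflow::MetaGraphDef& meta_graph = saved_model.meta_graphs(0);
session->reset(tensorflow::NewSession(tensorflow::SessionOptions()));
// This only creates the graph; the trained variables still live in the
// variables/ directory next to saved_model.pb, so I suspect LoadSavedModel
// (below) is the intended way to get a usable session.
Status session_create_status = (*session)->Create(meta_graph.graph_def());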

I would like help understanding and using:

SavedModelBundle bundle;
...
LoadSavedModel(session_options, run_options, export_dir, {kSavedModelTagTrain},
               &bundle);

as described in the Save and Restore documentation provided by TensorFlow.
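From reading tensorflow/cc/saved_model/loader.h, I think the loading code should look roughly like the sketch below. The export_dir path and the input tensor shape are placeholders, and I am assuming the "serve" tag and the "serving_default" signature key because that is what my Python code above writes; I have not gotten this to work yet:

#include <string>
#include <vector>

#include "tensorflow/cc/saved_model/loader.h"
#include "tensorflow/cc/saved_model/signature_constants.h"
#include "tensorflow/cc/saved_model/tag_constants.h"
#include "tensorflow/core/framework/tensor.h"
#include "tensorflow/core/public/session.h"

tensorflow::Status LoadAndRunSavedModel(const std::string& export_dir) {
  tensorflow::SavedModelBundle bundle;
  tensorflow::SessionOptions session_options;
  tensorflow::RunOptions run_options;

  // export_dir is the directory I passed to SavedModelBuilder in Python.
  // Since the meta graph was added with tag_constants.SERVING, I load it
  // with the matching "serve" tag rather than kSavedModelTagTrain.
  tensorflow::Status load_status = tensorflow::LoadSavedModel(
      session_options, run_options, export_dir,
      {tensorflow::kSavedModelTagServe}, &bundle);
  if (!load_status.ok()) {
    return load_status;
  }

  // Look up the default serving signature ("serving_default") to translate
  // the "inputs"/"outputs" keys from build_signature_def into the actual
  // tensor names in the graph.
  const auto& signature_map = bundle.meta_graph_def.signature_def();
  const tensorflow::SignatureDef& signature =
      signature_map.at(tensorflow::kDefaultServingSignatureDefKey);
  const std::string input_name = signature.inputs().at("inputs").name();
  const std::string output_name = signature.outputs().at("outputs").name();

  // Placeholder input; the dtype and shape have to match whatever
  // model_input expects (the 1x224x224x3 float here is just an example).
  tensorflow::Tensor input(tensorflow::DT_FLOAT,
                           tensorflow::TensorShape({1, 224, 224, 3}));

  // The bundle owns a session with the variables already restored from the
  // variables/ directory, so no separate Create()/Restore() is needed.
  std::vector<tensorflow::Tensor> outputs;
  return bundle.session->Run({{input_name, input}}, {output_name}, {},
                             &outputs);
}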

To conclude, I want a way to save a TensorFlow model in Python using SavedModelBuilder and load it in C++.
