
I am loading the model as

import os
import tensorflow as tf
from tensorflow.python.platform import gfile

def _load_model(model_filepath):
    model_exp = os.path.expanduser(model_filepath)
    if os.path.isfile(model_exp):
        print("loading model to graph")
        # read the frozen GraphDef and import it into the default graph
        with gfile.FastGFile(model_exp, 'rb') as f:
            graph_def = tf.GraphDef()
            graph_def.ParseFromString(f.read())
            tf.import_graph_def(graph_def, name='')

and using this function in the following code:

tf.reset_default_graph()
with tf.Session(config=tf.ConfigProto(log_device_placement=False)) as sess:
    _load_model(model_filepath=model_path)
    test_set = _get_test_data(input_directory)
    images, labels = _load_images_and_labels(test_set, image_size=160,
                                             batch_size=batch_size,
                                             num_threads=num_threads,
                                             num_epochs=1)
    init_op = tf.group(tf.global_variables_initializer(),
                       tf.local_variables_initializer())
    sess.run(init_op)
    images_placeholder = tf.get_default_graph().get_tensor_by_name("input:0")
    embedding_layer = tf.get_default_graph().get_tensor_by_name("embeddings:0")
    phase_train_placeholder = tf.get_default_graph().get_tensor_by_name("phase_train:0")

On each API call I am resetting the default graph and loading the model, which takes a long time. I want to load the model only once and reuse it in a session with a new graph. How can I achieve this?
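Roughly, what I am aiming for is something like this (just a sketch; `handle_request` is a placeholder for my real API handler, and the tensor names are the same ones used above):

# sketch: build the graph and create the session once at startup,
# then reuse both for every API call
graph = tf.Graph()
with graph.as_default():
    _load_model(model_filepath=model_path)
sess = tf.Session(graph=graph)

def handle_request(image_batch):  # placeholder for the real API handler
    images_placeholder = graph.get_tensor_by_name("input:0")
    embeddings = graph.get_tensor_by_name("embeddings:0")
    phase_train_placeholder = graph.get_tensor_by_name("phase_train:0")
    return sess.run(embeddings, feed_dict={images_placeholder: image_batch,
                                           phase_train_placeholder: False})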

chetan dev
  • Did you try to load your model with `tf.train.Saver()` and `restore`? – Milan Feb 02 '18 at 09:24
  • Please have a look here: https://stackoverflow.com/questions/33759623/tensorflow-how-to-save-restore-a-model?rq=1 – Jonathan DEKHTIAR Feb 02 '18 at 09:29
  • Possible duplicate of [Tensorflow: how to save/restore a model?](https://stackoverflow.com/questions/33759623/tensorflow-how-to-save-restore-a-model) – Jonathan DEKHTIAR Feb 02 '18 at 09:29
  • I am resetting the default graph on each API call, which is why I am not able to preload my model. Is there a way I can use a preloaded model in a session with a new graph? – chetan dev Feb 02 '18 at 09:48
  • The model depends on the graph; if you change the graph, you have to load the corresponding model. – Milan Feb 02 '18 at 15:21

1 Answer


Usually you save and load models with tf.train.Saver(); see the docs.

So after you train your model you do something like this:

saver = tf.train.Saver()
saver.save(sess, "/path/model.ckpt")

and when you want to load ("restore") you do something like this:

saver = tf.train.Saver()
saver.restore(sess, "/path/model.ckpt")

As Jonathan DEKHTIAR has already mentioned, it makes sense to use the search before asking questions: Tensorflow: how to save/restore a model?
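A rough sketch of how that could look for the use case in the question, restoring once at startup and then reusing the same session for every API call (the tensor names come from the question; the checkpoint path and `get_embeddings` are just placeholders):

import tensorflow as tf

sess = tf.Session()
saver = tf.train.import_meta_graph("/path/model.ckpt.meta")  # rebuilds the graph
saver.restore(sess, "/path/model.ckpt")                      # loads the weights once

graph = tf.get_default_graph()
images_placeholder = graph.get_tensor_by_name("input:0")
embeddings = graph.get_tensor_by_name("embeddings:0")
phase_train_placeholder = graph.get_tensor_by_name("phase_train:0")

def get_embeddings(image_batch):  # call this from every API request
    return sess.run(embeddings, feed_dict={images_placeholder: image_batch,
                                           phase_train_placeholder: False})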

Milan