
I am using TensorFlow 1.12 to run some old machine learning code, and I am trying to separate the training and the testing process into different files. After training the model for a certain number of iterations, I call the following:

train.py

restorer = tf.train.import_meta_graph(save_path + '.meta')
restorer.restore(sess, save_path)
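
For context, the checkpoint that save_path points to is written during training with an ordinary tf.train.Saver; a minimal sketch of that part (num_iterations, save_every and the loop structure here are placeholders, not my exact code):

import tensorflow as tf

env_model = EnvModel()        # build the graph before creating the Saver
saver = tf.train.Saver()      # collects trainable and moving-average variables

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for step in range(num_iterations):
        # ... run one training step here ...
        if step % save_every == 0:
            # writes save_path.meta, save_path.index and save_path.data-*
            saver.save(sess, save_path)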

In the test file I load the model as follows:

test.py

with tf.Session() as sess:
    restorer = tf.train.import_meta_graph(save_path + '.meta')
    restorer.restore(sess, save_path=save_path)
    env_model = EnvModel() # create model used in do_eval function 
    est, actual, error = do_eval(test_df)

In the do_eval function, there is the following sess.run call:

est_next_state, loss = sess.run(
    [env_model.est_next_state, env_model.loss],
    feed_dict={env_model.cur_state: states,
               env_model.next_state: next_states,
               env_model.actions: actions,
               env_model.done_flags: done_flags,
               env_model.phase: False})
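
For completeness, EnvModel defines the placeholders used above and builds the network with batch normalization driven by the phase placeholder, which is where the BatchNorm_1/moving_mean variable in the error below comes from. A trimmed-down sketch (the layer sizes and dimensions are illustrative, not my real architecture):

import tensorflow as tf

STATE_DIM, ACTION_DIM = 4, 2   # placeholder sizes, not the real ones

class EnvModel:
    def __init__(self):
        self.cur_state = tf.placeholder(tf.float32, [None, STATE_DIM])
        self.actions = tf.placeholder(tf.float32, [None, ACTION_DIM])
        self.next_state = tf.placeholder(tf.float32, [None, STATE_DIM])
        self.done_flags = tf.placeholder(tf.float32, [None])
        self.phase = tf.placeholder(tf.bool)   # True while training
        x = tf.concat([self.cur_state, self.actions], axis=1)
        x = tf.layers.dense(x, 64, activation=tf.nn.relu)
        # batch norm creates the moving_mean / moving_variance variables
        # mentioned in the error
        x = tf.layers.batch_normalization(x, training=self.phase)
        self.est_next_state = tf.layers.dense(x, STATE_DIM)
        self.loss = tf.reduce_mean(tf.square(self.est_next_state - self.next_state))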

I get the following error:

tensorflow.python.framework.errors_impl.FailedPreconditionError: Attempting to use uninitialized value BatchNorm_1/moving_mean_5

Does this mean that in TensorFlow v1 I can only load the model within the training script itself?

calveeen
  • When I look at the [documentation of `import_meta_graph`](https://www.tensorflow.org/api_docs/python/tf/compat/v1/train/import_meta_graph) there is an `add_to_collection` and `get_collection` instruction which apparently allows to save/load the train op. Why don't you use something similar? – LucG Apr 19 '20 at 07:35
  • Does this answer your question? https://stackoverflow.com/questions/33759623/tensorflow-how-to-save-restore-a-model – LucG Apr 19 '20 at 07:40
  • I did something similar to what was proposed in the link, but I'm getting an error with the model – calveeen Apr 19 '20 at 07:45

0 Answers