
I want to train a model in TensorFlow, then export the trained weights (and biases), and use them to resume training in a new graph with a different, smaller learning rate. I do not need to save or reuse the architecture of the previously trained model (I will define the model architecture again). All I need is to initialize the new weights to the trained values. Is there a straightforward way to do this? Thanks.

In an attempt to restore and reuse the weights, I have used the following code, but it wasn't successful:

import tensorflow as tf

tf.reset_default_graph()
sess = tf.Session()

# Restore the previously trained graph and its checkpointed values
new_saver = tf.train.import_meta_graph("model20256.meta")
new_saver.restore(sess, tf.train.latest_checkpoint('./'))
graph = tf.get_default_graph()

# Import the trained weights as tensors
W1__ = graph.get_tensor_by_name('W1:0')
b1__ = graph.get_tensor_by_name('b1:0')
W2__ = graph.get_tensor_by_name('W2:0')
b2__ = graph.get_tensor_by_name('b2:0')

# Create new weight variables initialized from the trained weights
W1_ = tf.Variable(W1__)
W2_ = tf.Variable(W2__)
b1_ = tf.Variable(b1__)
b2_ = tf.Variable(b2__)

sess_ = tf.Session()
sess_.run(tf.global_variables_initializer())

# Define a new graph
g = tf.Graph()
with g.as_default():
    # Copy the variables to the new graph
    W1 = tf.contrib.copy_graph.copy_variable_to_graph(W1_, g)
    W2 = tf.contrib.copy_graph.copy_variable_to_graph(W2_, g)
    b1 = tf.contrib.copy_graph.copy_variable_to_graph(b1_, g)
    b2 = tf.contrib.copy_graph.copy_variable_to_graph(b2_, g)
Mohammad Amin

1 Answer


Inside your second session, replace sess_.run(tf.global_variables_initializer()) with new_saver.restore(sess_, tf.train.latest_checkpoint('./')), so the variables take their checkpointed values instead of being re-initialized.
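More generally, a common pattern for this kind of warm start is to redefine the architecture in a fresh graph using the same variable names, and then let a tf.train.Saver fill those variables from the checkpoint before training continues with the smaller learning rate. The following is a minimal sketch of that idea, not your exact model: the placeholders, layer sizes, and mean-squared-error loss (x, y, n_in, n_hidden, n_out) are assumptions made up for illustration, while the names W1, b1, W2, b2 and the checkpoint in './' come from the question.

import tensorflow as tf

n_in, n_hidden, n_out = 10, 32, 1   # hypothetical layer sizes

tf.reset_default_graph()

# Re-define the architecture with the SAME variable names used in the checkpoint
x = tf.placeholder(tf.float32, [None, n_in])
y = tf.placeholder(tf.float32, [None, n_out])
W1 = tf.get_variable('W1', [n_in, n_hidden])
b1 = tf.get_variable('b1', [n_hidden])
W2 = tf.get_variable('W2', [n_hidden, n_out])
b2 = tf.get_variable('b2', [n_out])

hidden = tf.nn.relu(tf.matmul(x, W1) + b1)
pred = tf.matmul(hidden, W2) + b2
loss = tf.reduce_mean(tf.square(pred - y))   # assumed loss for illustration

# Smaller learning rate for the resumed training
train_op = tf.train.GradientDescentOptimizer(1e-4).minimize(loss)

# Restore only the weights and biases from the checkpoint
saver = tf.train.Saver(var_list=[W1, b1, W2, b2])

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())             # initialize everything first
    saver.restore(sess, tf.train.latest_checkpoint('./'))   # then overwrite W1, b1, W2, b2 with trained values
    # sess.run(train_op, feed_dict={x: ..., y: ...})        # continue training from here

Passing an explicit var_list to the Saver restores only the weights and biases, so the restore does not fail if the new graph contains variables that were never checkpointed (for example, state added by a different optimizer).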

midhun pk