
I want to know how to make changes to a graph loaded from TensorFlow's meta and checkpoint files, like this:

    saver = tf.train.import_meta_graph('***.meta')
    saver.restore(sess, tf.train.latest_checkpoint('./'))

For example, the existing graph contains old_layer1 -> old_layer2 with pretrained weights. I want to insert a layer so it becomes old_layer1 -> new_layer -> old_layer2, where new_layer is randomly initialized since there are no pretrained parameters for it. An answer here said this is impossible, since a tf graph only allows appending ops. Is that true?

So I wonder if this can be worked around by loading the pretrained layers as individual variables, assigning the pretrained weights as their initial values, and connecting them myself, so that I can add new layers between the old ones. But I'm not sure how to do this in code.
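Something like the sketch below is roughly what I have in mind, though I don't know if it's the right approach. The variable names ('old_layer1/kernel' etc.) are just my guesses for what the checkpoint contains; I would look up the real names with tf.train.list_variables:

    import tensorflow as tf

    ckpt = tf.train.latest_checkpoint('./')

    # Pull the pretrained values out of the checkpoint as numpy arrays.
    w1 = tf.train.load_variable(ckpt, 'old_layer1/kernel')
    b1 = tf.train.load_variable(ckpt, 'old_layer1/bias')
    w2 = tf.train.load_variable(ckpt, 'old_layer2/kernel')
    b2 = tf.train.load_variable(ckpt, 'old_layer2/bias')

    # Rebuild the old layers as fresh variables initialized from those
    # values, and wire a randomly initialized layer in between.
    x = tf.placeholder(tf.float32, [None, w1.shape[0]])
    h1 = tf.nn.relu(tf.matmul(x, tf.Variable(w1)) + tf.Variable(b1))

    new_w = tf.Variable(tf.truncated_normal([w1.shape[1], w2.shape[0]], stddev=0.1))
    new_b = tf.Variable(tf.zeros([w2.shape[0]]))
    h_new = tf.nn.relu(tf.matmul(h1, new_w) + new_b)

    out = tf.matmul(h_new, tf.Variable(w2)) + tf.Variable(b2)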

Newb

1 Answer


Doing this with raw TensorFlow can be complicated, since the tf graph does not directly encode the layer structure. If your model was built with tf.keras, however, this is fairly straightforward, as loading a Keras model also loads its layer structure.
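A minimal sketch of what that could look like with the Keras functional API; the saved-model file name and the layer names are assumptions about your setup:

    import tensorflow as tf

    # load_model restores both the layer structure and the pretrained weights.
    model = tf.keras.models.load_model('pretrained_model.h5')

    old_layer1 = model.get_layer('old_layer1')
    old_layer2 = model.get_layer('old_layer2')

    # Size the new layer so old_layer2 still receives the input
    # dimension it was trained with.
    units = old_layer1.output_shape[-1]

    inputs = tf.keras.Input(shape=model.input_shape[1:])
    x = old_layer1(inputs)                          # reuses pretrained weights
    x = tf.keras.layers.Dense(units, activation='relu',
                              name='new_layer')(x)  # randomly initialized
    outputs = old_layer2(x)                         # reuses pretrained weights

    new_model = tf.keras.Model(inputs, outputs)

Calling the old layer objects on new tensors reuses their existing weights, while the freshly created Dense layer gets a random initialization.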

Alexandre Passos