
I have trained a custom neural network with the function:

tf.estimator.train_and_evaluate

After training completes successfully, the model directory contains the following files:

  • checkpoint
  • events.out.tfevents.1538489166.ti
  • model.ckpt-0.data-00000-of-00002
  • model.ckpt-0.index
  • model.ckpt-10.data-00000-of-00002
  • model.ckpt-10.index
  • eval (directory)
  • graph.pbtxt
  • model.ckpt-0.data-00001-of-00002
  • model.ckpt-0.meta
  • model.ckpt-10.data-00001-of-00002
  • model.ckpt-10.meta

Now I need to export the weights and biases of every layer into a raw data structure, e.g. a NumPy array.

I have read multiple pages on TensorFlow and related topics, but cannot find an answer to this question. My first thought was to combine the files into a graph.pb with freeze.py, as suggested here:

Tensorflow: How to convert .meta, .data and .index model files into one graph.pb file

But then still the main question is unsolved.

asked by michelvl92 (edited by talonmies)

2 Answers


If you only wish to evaluate tensors, you can check out this question. But if you wish to e.g. deploy your network, you can take a look at TensorFlow Serving, which is probably the most performant option right now. And if you want to export this network to other frameworks and use it there, you can use ONNX for that purpose.
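As a side note on evaluating tensors from a checkpoint: with a recent TF install you can read every variable stored in a checkpoint straight into NumPy arrays via tf.train.load_checkpoint, without rebuilding the graph at all. A minimal self-contained sketch (the tiny checkpoint below is created purely for illustration; in practice you would point the reader at the checkpoint prefix inside your own model_dir):

```python
import numpy as np
import tensorflow as tf

# Create a tiny checkpoint purely for illustration.
v = tf.Variable([[1.0, 2.0]], name="w")
ckpt = tf.train.Checkpoint(w=v)
path = ckpt.save("./demo_ckpt/ckpt")

# Read every stored variable directly into a numpy.ndarray,
# without rebuilding the graph.
reader = tf.train.load_checkpoint(path)
arrays = {name: reader.get_tensor(name)
          for name in reader.get_variable_to_shape_map()}

for name, arr in arrays.items():
    print(name, getattr(arr, "shape", None))
```

The checkpoint keys are the object paths under which the variables were saved, so this also gives you a natural per-variable (and hence per-layer) separation.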

answered by Kevin He

If saving weights and biases in a numpy array is your strict requirement, you can follow this example:

# In a TF shell, define all requirements and call the model function 
y = model(x, is_training=False, reuse=tf.AUTO_REUSE) # For example

Once you call this function, you can see all the variables in the graph by running

tf.global_variables()

You need to restore all these variables from the latest checkpoint (say ckpt_dir) and then execute each of these variables to get the latest values.

checkpoint = tf.train.latest_checkpoint('./model_dir/')
fine_tune = tf.contrib.slim.assign_from_checkpoint_fn(checkpoint,
                                                      tf.global_variables(),
                                                      ignore_missing_vars=True)


sess = tf.Session()
sess.run(tf.global_variables_initializer())
fine_tune(sess)  # actually restore the checkpoint values into the session
gv = sess.run(tf.global_variables())

Now gv will be a list of the values of all your variables (weights and biases). You can access any individual component via indexing, e.g. gv[5]. Or you can convert the entire thing into an array and save it using NumPy.

np.save('my_weights', np.array(gv))

This will save all your weights and biases in your current working directory as a numpy array - 'my_weights.npy'.
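Note that because the variables have different shapes, np.array(gv) produces a dtype=object array. If you also need the weights separated per layer, one option (a sketch; the layer names below are made up for illustration — in practice you would build the dict from zip([v.name for v in tf.global_variables()], gv)) is to save a name-to-array mapping with np.savez:

```python
import numpy as np

# Hypothetical per-layer values; in practice build this dict from
# zip([v.name for v in tf.global_variables()], gv).
layer_values = {
    "dense1_kernel": np.ones((3, 4)),
    "dense1_bias": np.zeros(4),
}

# np.savez stores each array under its own key in a single .npz file.
np.savez("my_weights.npz", **layer_values)

# Each layer's array can then be loaded back individually by name.
restored = np.load("my_weights.npz")
print(sorted(restored.files))           # ['dense1_bias', 'dense1_kernel']
print(restored["dense1_kernel"].shape)  # (3, 4)
```

This keeps each layer's weights and biases addressable by name instead of relying on list indices.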

Hope this helps.

answered by End-2-End
  • Hi, thanks for the answer; this partly helps, but the following happens: tf.global_variables() stays empty when called directly after tf.estimator.train_and_evaluate. This is probably because I am not correctly calling the model function y = model(), as I'm not sure what you mean by that. Furthermore, I want to not only save the weights into a numpy array, but also separate them per layer. – michelvl92 Oct 02 '18 at 22:33
  • Okay. May I ask what exactly you plan to achieve? And why can't you use checkpoints? Any details on how you're going to use these might also help. – End-2-End Oct 03 '18 at 23:13