
I have a tensor such as this:

`Tensor("activation_1/Relu:0", shape=(?, 32, 32, 96), dtype=float32)`

I want to serialize this tensor in order to move it across devices and then reconstruct it, but it seems impossible to serialize such a tensor with Pickle or Dill.
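For example, even a stand-in graph-mode tensor with the same shape (not my actual model output) shows the problem:

```python
import pickle
import tensorflow as tf

# stand-in tensor with the same shape as above, just to demonstrate the failure
x = tf.placeholder(tf.float32, shape=(None, 32, 32, 96))
act = tf.nn.relu(x)

pickle.dumps(act)  # fails with a TypeError: in graph mode the tensor is a symbolic
                   # handle tied to the graph, not a value that Pickle/Dill can dump
```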

Update

The tensor is the result of this model block.

```python
# (layers imported from tf.keras here; standalone `keras.layers` works the same way)
from tensorflow.keras.layers import Conv2D, BatchNormalization, Activation

def conv_module(x, K, kX, kY, stride, padding="same"):
    # define a CONV => BN => RELU pattern
    x = Conv2D(K, (kX, kY), strides=stride, padding=padding)(x)
    x = BatchNormalization()(x)
    x = Activation("relu")(x)
    # return the block
    return x
```
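
For context, the block is called roughly like this to produce the tensor above (the input shape and argument values here are only illustrative):

```python
from tensorflow.keras.layers import Input

# illustrative input; K=96 matches the 96 channels in the printed shape
inputs = Input(shape=(32, 32, 3))
act = conv_module(inputs, K=96, kX=3, kY=3, stride=(1, 1))
print(act)  # Tensor("activation_1/Relu:0", shape=(?, 32, 32, 96), dtype=float32)
```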

Currently, I am not using eager execution, but I could do that as well if it's an option.
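
For reference, what does serialize without problems is the evaluated value of the tensor (a NumPy array) rather than the symbolic tensor itself. A rough sketch, continuing from the snippet above and assuming TF 1.x with tf.keras:

```python
import pickle
import numpy as np
import tensorflow as tf

# Graph mode: evaluate the tensor in the Keras session to get a NumPy array.
# `inputs` and `act` are the placeholder and tensor from the snippet above.
sess = tf.keras.backend.get_session()
value = sess.run(
    act,
    feed_dict={
        inputs: np.random.rand(8, 32, 32, 3).astype("float32"),
        tf.keras.backend.learning_phase(): 0,  # 0 = inference, so BN uses moving stats
    },
)
payload = pickle.dumps(value)  # a plain ndarray pickles without any trouble

# Eager mode: if eager execution were enabled and the block were called on concrete
# data, the result would be an EagerTensor whose value is simply act.numpy().
```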

ankahira
  • Why would you want to serialize the activation output? – IanQ Feb 14 '19 at 21:46
  • Can you give some more details about what you are trying to do? Are you trying to serialize the value of the tensor for some specific input? Or the model that computes the tensor? Are you in graph mode or eager mode? – jdehesa Feb 15 '19 at 09:57
  • @jdehesa I have updated the question to include the information about the tensor. I am in graph mode but I can change into eager if that helps to break the execution flow. – ankahira Feb 15 '19 at 12:01
  • @ankahira Have a look at [A Tool Developer's Guide to TensorFlow Model Files](https://www.tensorflow.org/guide/extend/model_files)... I think you can use a [`tf.GraphDef`](https://www.tensorflow.org/api_docs/python/tf/GraphDef), unless you also want to save the trained weights, in which case you can either [freeze the model](https://stackoverflow.com/a/45466355) or use a [MetaGraph](https://www.tensorflow.org/api_guides/python/meta_graph) or a [saved model](https://www.tensorflow.org/guide/saved_model). – jdehesa Feb 15 '19 at 12:04
  • @jdehesa Thanks for that. A quick follow-up: can I freeze part of a model and later unfreeze it in order to combine it with some other part? For instance, my complete model is composed of several blocks like the one in the question. – ankahira Feb 15 '19 at 12:17
  • @ankahira Yes, you can do more or less anything you want; you just need to configure everything appropriately. For that case, if you prefer it you can also, instead of freezing one part, save a checkpoint and then restore it and continue optimizing only a subset of the variables (a rough sketch of this idea follows below). TensorFlow is, ultimately, quite flexible in that sense (I think), so you can do whatever suits you best. – jdehesa Feb 15 '19 at 12:21
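
A minimal, self-contained sketch of the checkpoint route from the last comment: save all variables, restore them, and then keep optimizing only a chosen subset. The variable and scope names here are placeholders, not from the question's model:

```python
import tensorflow as tf

# Two toy "blocks" standing in for the conv blocks of the real model.
with tf.variable_scope("block_a"):
    a = tf.get_variable("w", shape=[3], initializer=tf.zeros_initializer())
with tf.variable_scope("block_b"):
    b = tf.get_variable("w", shape=[3], initializer=tf.zeros_initializer())

loss = tf.reduce_sum(tf.square(a - 1.0)) + tf.reduce_sum(tf.square(b - 1.0))

# After restoring, continue training only the variables under "block_b".
subset = tf.get_collection(tf.GraphKeys.TRAINABLE_VARIABLES, scope="block_b")
train_op = tf.train.GradientDescentOptimizer(0.1).minimize(loss, var_list=subset)

saver = tf.train.Saver()  # covers all variables by default

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    saver.save(sess, "/tmp/blocks.ckpt")
    # The graph structure alone can also be exported as a GraphDef, e.g.:
    # tf.train.write_graph(sess.graph_def, "/tmp", "blocks.pbtxt")

with tf.Session() as sess:
    saver.restore(sess, "/tmp/blocks.ckpt")
    sess.run(train_op)  # updates block_b/w only; block_a/w keeps its restored value
```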
