
I was using code from OpenAI that relies on the set_shape() function to turn a graph built for batch size 1 into one that works with any batch size. This supposedly worked with TF versions < 1.4 but no longer works (see this issue). Here is the code:

  import os
  import tensorflow as tf

  # Load the frozen Inception graph (MODEL_DIR is the directory the model
  # was downloaded to) and import it into the default graph.
  with tf.gfile.FastGFile(os.path.join(
      MODEL_DIR, 'classify_image_graph_def.pb'), 'rb') as f:
    graph_def = tf.GraphDef()
    graph_def.ParseFromString(f.read())
    _ = tf.import_graph_def(graph_def, name='')
  # Works with an arbitrary minibatch size.
  with tf.Session() as sess:
    pool3 = sess.graph.get_tensor_by_name('pool_3:0')
    ops = pool3.graph.get_operations()
    for op_idx, op in enumerate(ops):
        for o in op.outputs:
            shape = o.get_shape()
            shape = [s.value for s in shape]
            new_shape = []
            for j, s in enumerate(shape):
                # Replace a leading batch dimension of 1 with None.
                if s == 1 and j == 0:
                    new_shape.append(None)
                else:
                    new_shape.append(s)
            o.set_shape(tf.TensorShape(new_shape))

This seems to be intended behavior, as @mrry said in this answer, because set_shape() is supposed to add shape information, not remove it. So how would I go about converting a graph built for batch size 1 to batch size None in a general, TensorFlow-idiomatic way, without having to collect all the weights and redefine each operation by hand?
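
If I understand the behavior correctly, the sticking point is that set_shape() merges the new shape with the existing one, so a known batch dimension of 1 is kept rather than relaxed to None. A minimal sketch illustrating this (assuming TF 1.x; the 2048 feature dimension is just a placeholder for illustration):

  import tensorflow as tf

  x = tf.placeholder(tf.float32, shape=[1, 2048])  # batch dimension fixed to 1
  x.set_shape([None, 2048])   # merges with the existing shape instead of replacing it
  print(x.get_shape())        # still (1, 2048): the batch dimension is not relaxed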
