
I was using a model that includes tf.keras.applications.MobileNetV2 and had a problem with the frozen graph.

I found it is related to the BatchNormalization layer, so I wrote a simpler test program, TESTE_PART1.py, with just one layer. I used the freeze_graph tool to create the .pb file, and then another Python program, TESTE_PART2.py, to read the .pb with tf.import_graph_def.

The error is:

Traceback (most recent call last):
  File "/home/daniel/sistema/anaconda3/envs/tensorflow_1.13/lib/python3.6/site-packages/tensorflow/python/framework/importer.py", line 426, in import_graph_def
    graph._c_graph, serialized, options)  # pylint: disable=protected-access
tensorflow.python.framework.errors_impl.InvalidArgumentError: Input 0 of node batch_normalization_v1/cond/ReadVariableOp/Switch was passed float from batch_normalization_v1/moving_mean:0 incompatible with expected resource.

During handling of the above exception, another exception occurred:
Traceback (most recent call last):
  File "/home/daniel/sistema/anaconda3/envs/tensorflow_1.13/lib/python3.6/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/home/daniel/sistema/anaconda3/envs/tensorflow_1.13/lib/python3.6/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "/home/daniel/comp/armis/Plates/pysrc/cornerdetect/teste_part2.py", line 9, in <module>
    tf.import_graph_def(od_graph_def, name='')
  File "/home/daniel/sistema/anaconda3/envs/tensorflow_1.13/lib/python3.6/site-packages/tensorflow/python/util/deprecation.py", line 507, in new_func
    return func(*args, **kwargs)
  File "/home/daniel/sistema/anaconda3/envs/tensorflow_1.13/lib/python3.6/site-packages/tensorflow/python/framework/importer.py", line 430, in import_graph_def
    raise ValueError(str(e))
ValueError: Input 0 of node batch_normalization_v1/cond/ReadVariableOp/Switch was passed float from batch_normalization_v1/moving_mean:0 incompatible with expected resource.

I also tried to create the .pb in Python with the freeze_session method described in How to export Keras .h5 to tensorflow .pb? The result is the same.
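
For reference, this is roughly the freeze_session variant I tried, as a minimal sketch (it rebuilds the same one-layer model as TESTE_PART1.py; the helper follows the linked answer and uses tf.graph_util.convert_variables_to_constants):

import tensorflow as tf

# Same one-layer model as in TESTE_PART1.py.
l1 = tf.keras.layers.BatchNormalization(input_shape=(100,))
model = tf.keras.models.Sequential([l1])

def freeze_session(sess, output_names):
    # Replace the graph's variables with constants holding their current
    # values, keeping only the nodes needed to compute output_names.
    graph_def = sess.graph.as_graph_def()
    return tf.graph_util.convert_variables_to_constants(
        sess, graph_def, output_names)

sess = tf.keras.backend.get_session()
frozen = freeze_session(sess, [model.output.op.name])
tf.train.write_graph(frozen, ".", "teste.freeze_session.pb", as_text=False)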

The error complains about incompatible types. I modified TESTE_PART1.py to print the dtypes of the two operations; both are "resource":

O1 INPUT Tensor("batch_normalization_v1/moving_mean:0", shape=(), dtype=resource)
O1 INPUT Tensor("batch_normalization_v1/cond/pred_id:0", shape=(), dtype=bool)
O2 OUTPUT Tensor("batch_normalization_v1/moving_mean:0", shape=(), dtype=resource)
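
To double-check what ends up in the frozen file, here is a small sketch that inspects the nodes named in the error message inside the frozen GraphDef (nothing here is specific to my model; the node names come straight from the traceback):

import tensorflow as tf

# Parse the frozen graph and inspect the nodes mentioned in the error.
gd = tf.GraphDef()
with open("teste.freeze_graph.pb", "rb") as fd:
    gd.ParseFromString(fd.read())

for node in gd.node:
    if "moving_mean" in node.name or "ReadVariableOp/Switch" in node.name:
        print(node.name, node.op, list(node.attr.keys()))
        if "dtype" in node.attr:
            print("  dtype:", tf.DType(node.attr["dtype"].type))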

I tested it with TensorFlow 1.12 and 1.13. The operating system is Linux 4.4.0-148-generic #174-Ubuntu SMP Tue May 7 12:20:14 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux.

TESTE_PART1.py

import tensorflow as tf

l1 = tf.keras.layers.BatchNormalization(input_shape=(100,))
model = tf.keras.models.Sequential([l1])

print("output name", model.output.op.name)
print("input name", model.input.op.name)
# output name batch_normalization_v1/batchnorm/add_1
# input name batch_normalization_v1_input

# Export a checkpoint plus meta graph that freeze_graph can consume.
saver = tf.train.Saver()
sess = tf.keras.backend.get_session()
saver.save(sess, "teste.saver_export")

# Inspect the dtypes around the node named in the import error.
op1 = sess.graph.get_operation_by_name("batch_normalization_v1/cond/ReadVariableOp/Switch")
for inp in op1.inputs:
    print("O1 INPUT", inp)
op2 = sess.graph.get_operation_by_name("batch_normalization_v1/moving_mean")
for out in op2.outputs:
    print("O2 OUTPUT", out)

Then I ran freeze_graph on the exported checkpoint:

freeze_graph --input_meta_graph=teste.saver_export.meta --input_checkpoint=teste.saver_export --output_graph=teste.freeze_graph.pb --output_node_names="batch_normalization_v1/batchnorm/add_1" --input_binary=true

$ ls -l teste*
-rw-rw-r-- 1 daniel daniel  6990 jun  1 18:35 teste.freeze_graph.pb
-rw-rw-r-- 1 daniel daniel  1600 jun  1 18:34 teste.saver_export.data-00000-of-00001
-rw-rw-r-- 1 daniel daniel   239 jun  1 18:34 teste.saver_export.index
-rw-rw-r-- 1 daniel daniel 28006 jun  1 18:34 teste.saver_export.meta

TESTE_PART2.py

import tensorflow as tf

detection_graph = tf.Graph()
with detection_graph.as_default():
    # Load the frozen GraphDef produced by freeze_graph and import it.
    od_graph_def = tf.GraphDef()
    with open("teste.freeze_graph.pb", 'rb') as fd:
        serialized_graph = fd.read()
        od_graph_def.ParseFromString(serialized_graph)
        tf.import_graph_def(od_graph_def, name='')  # fails here
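
For completeness, this is roughly how I would run the imported graph once the import succeeds, continuing from the script above (a sketch; the tensor names come from the prints in TESTE_PART1.py and the ":0" output index is assumed):

import numpy as np

with tf.Session(graph=detection_graph) as sess:
    inp = detection_graph.get_tensor_by_name("batch_normalization_v1_input:0")
    out = detection_graph.get_tensor_by_name("batch_normalization_v1/batchnorm/add_1:0")
    result = sess.run(out, feed_dict={inp: np.zeros((1, 100), dtype=np.float32)})
    print(result.shape)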
  • Do you think the problem is related to this? https://github.com/keras-team/keras/pull/11847 – dcolazin Jun 04 '19 at 13:31
  • It doesn't look like it. – Daniel Jun 05 '19 at 10:20
  • Has anyone tried the code? It is just a few lines. I tried it on two different computers, so I think it is not about my TensorFlow installation. Maybe no one is using frozen .pb models anymore? I can try to convert the model directly to tflite. – Daniel Jun 05 '19 at 10:25
  • Here is the code in colaboratory: https://colab.research.google.com/drive/1kh9lxgXK9BTiFItOmhnDTwmhj_mbzC4A – Daniel Jun 05 '19 at 11:08
