I'm trying to freeze a model I created with Keras and saved as an .h5 file, but I get this error message every time I try to run the freeze_session function: output_node/Identity is not in graph

This is my code (I'm using Tensorflow 2.1.0):

import tensorflow as tf
import tensorflow.keras as kr
from tensorflow.keras import backend as K
from tensorflow.python.framework.graph_io import write_graph

def freeze_session(session, keep_var_names=None, output_names=None, clear_devices=True):
    """
    Freezes the state of a session into a pruned computation graph.

    Creates a new computation graph where variable nodes are replaced by
    constants taking their current value in the session. The new graph will be
    pruned so subgraphs that are not necessary to compute the requested
    outputs are removed.
    @param session The TensorFlow session to be frozen.
    @param keep_var_names A list of variable names that should not be frozen,
                          or None to freeze all the variables in the graph.
    @param output_names Names of the relevant graph outputs.
    @param clear_devices Remove the device directives from the graph for better portability.
    @return The frozen graph definition.
    """
    graph = session.graph
    with graph.as_default():
        freeze_var_names = list(set(v.op.name for v in tf.compat.v1.global_variables()).difference(keep_var_names or []))
        output_names = output_names or []
        output_names += [v.op.name for v in tf.compat.v1.global_variables()]
        input_graph_def = graph.as_graph_def()
        if clear_devices:
            for node in input_graph_def.node:
                node.device = ""
        frozen_graph = tf.compat.v1.graph_util.convert_variables_to_constants(
            session, input_graph_def, output_names, freeze_var_names)
        return frozen_graph

model = kr.models.load_model("model.h5")
model.summary()
# inputs:
print('inputs: ', model.input.op.name)
# outputs: 
print('outputs: ', model.output.op.name)
#layers:
layer_names=[layer.name for layer in model.layers]
print(layer_names)

Which prints:

inputs:  input_node
outputs: output_node/Identity
['input_node', 'conv2d_6', 'max_pooling2d_6', 'conv2d_7', 'max_pooling2d_7', 'conv2d_8', 'max_pooling2d_8', 'flatten_2', 'dense_4', 'dense_5', 'output_node']

as expected (the same layer names and outputs as in the model I saved after training it).

Then I try to call the freeze_session function and save the resulting frozen graph:

frozen_graph = freeze_session(K.get_session(), output_names=[out.op.name for out in model.outputs])
write_graph(frozen_graph, './', 'graph.pbtxt', as_text=True)
write_graph(frozen_graph, './', 'graph.pb', as_text=False)

but I get this error:

AssertionError                            Traceback (most recent call last)
<ipython-input-4-1848000e99b7> in <module>
----> 1 frozen_graph = freeze_session(K.get_session(), output_names=[out.op.name for out in model.outputs])
      2 write_graph(frozen_graph, './', 'graph.pbtxt', as_text=True)
      3 write_graph(frozen_graph, './', 'graph.pb', as_text=False)

<ipython-input-2-3214992381a9> in freeze_session(session, keep_var_names, output_names, clear_devices)
     24                 node.device = ""
     25         frozen_graph = tf.compat.v1.graph_util.convert_variables_to_constants(
---> 26             session, input_graph_def, output_names, freeze_var_names)
     27         return frozen_graph

c:\users\marco\anaconda3\envs\tfv2\lib\site-packages\tensorflow_core\python\util\deprecation.py in new_func(*args, **kwargs)
    322               'in a future version' if date is None else ('after %s' % date),
    323               instructions)
--> 324       return func(*args, **kwargs)
    325     return tf_decorator.make_decorator(
    326         func, new_func, 'deprecated',

c:\users\marco\anaconda3\envs\tfv2\lib\site-packages\tensorflow_core\python\framework\graph_util_impl.py in convert_variables_to_constants(sess, input_graph_def, output_node_names, variable_names_whitelist, variable_names_blacklist)
    275   # This graph only includes the nodes needed to evaluate the output nodes, and
    276   # removes unneeded nodes like those involved in saving and assignment.
--> 277   inference_graph = extract_sub_graph(input_graph_def, output_node_names)
    278 
    279   # Identify the ops in the graph.

c:\users\marco\anaconda3\envs\tfv2\lib\site-packages\tensorflow_core\python\util\deprecation.py in new_func(*args, **kwargs)
    322               'in a future version' if date is None else ('after %s' % date),
    323               instructions)
--> 324       return func(*args, **kwargs)
    325     return tf_decorator.make_decorator(
    326         func, new_func, 'deprecated',

c:\users\marco\anaconda3\envs\tfv2\lib\site-packages\tensorflow_core\python\framework\graph_util_impl.py in extract_sub_graph(graph_def, dest_nodes)
    195   name_to_input_name, name_to_node, name_to_seq_num = _extract_graph_summary(
    196       graph_def)
--> 197   _assert_nodes_are_present(name_to_node, dest_nodes)
    198 
    199   nodes_to_keep = _bfs_for_reachable_nodes(dest_nodes, name_to_input_name)

c:\users\marco\anaconda3\envs\tfv2\lib\site-packages\tensorflow_core\python\framework\graph_util_impl.py in _assert_nodes_are_present(name_to_node, nodes)
    150   """Assert that nodes are present in the graph."""
    151   for d in nodes:
--> 152     assert d in name_to_node, "%s is not in graph" % d
    153 
    154 

**AssertionError: output_node/Identity is not in graph** 

I don't really know how to fix this, so any help would be much appreciated.

  • Take a look at [this question and its answer](https://stackoverflow.com/questions/55562078/tensorflow-2-0-frozen-graph-support): in TF2 there is no support for freezing graphs. You'll need to export a SavedModel instead – GPhilo Jan 30 '20 at 09:39
  • On the other hand, for an (unsupported) method, have a look at [this blog post](https://leimao.github.io/blog/Save-Load-Inference-From-TF2-Frozen-Graph/). I can't guarantee it works, nor that it will keep working in the long term, though – GPhilo Jan 30 '20 at 09:42
  • @GPhilo thanks for the answer. I thought that calling the Tensorflow 1 functions would still work, but I guess it doesn't. I need a frozen graph, though, since I have to upload the model to a Google Vision Bonnet and its compiler requires a .pb file. So it's probably best to use TF1 directly at this point? Side question: the SavedModel I get from the model.save function consists of a directory that also contains a saved_model.pb file. Is that the same as the old (TF1) .pb frozen graphs? I assume it isn't, but I was wondering if there was a way to use it instead. – Marco Esposito Jan 30 '20 at 10:07
  • If you don't need something specific from TF2, using TF1 is probably easier for you, yes. As for the other question: no, they're different files. The SavedModel format requires the whole directory. ".pb" is just a generic extension for binary protobuf messages; the message format differs between a SavedModel and a frozen graph – GPhilo Jan 30 '20 at 10:09
  • Ok, thanks again. I'll try porting the code and model to TF1 then (even if that was giving me some issues too) and freeze the graph that way. – Marco Esposito Jan 30 '20 at 11:37
  • 1
    @GPhilo switching completely to Tensorflow 1.15 helped me save the file correctly! The .pb file still cannot be parsed by the Vision Bonnet compiler though, but I'll start another thread about this other issue. – Marco Esposito Jan 30 '20 at 15:59
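For reference, the (unsupported) TF2 approach from the blog post GPhilo linked looks roughly like this. This is a sketch, not a verified solution: the tiny Sequential model below is just a stand-in for the question's model, which you would load with kr.models.load_model("model.h5") instead.

```python
import tensorflow as tf
from tensorflow.python.framework.convert_to_constants import (
    convert_variables_to_constants_v2,
)

# Stand-in for kr.models.load_model("model.h5") from the question.
model = tf.keras.Sequential(
    [tf.keras.layers.Dense(2, input_shape=(3,), name="output_node")]
)

# Wrap the model call in a tf.function and trace a concrete function.
full_model = tf.function(lambda x: model(x))
concrete_func = full_model.get_concrete_function(
    tf.TensorSpec(model.inputs[0].shape, model.inputs[0].dtype)
)

# Replace the variables with constants, yielding a frozen ConcreteFunction.
frozen_func = convert_variables_to_constants_v2(concrete_func)

# Serialize the frozen GraphDef to a binary .pb file.
tf.io.write_graph(
    graph_or_graph_def=frozen_func.graph.as_graph_def(),
    logdir="./",
    name="frozen_graph.pb",
    as_text=False,
)
```

Whether the Vision Bonnet compiler accepts a .pb produced this way is a separate question; as GPhilo notes, this format is not the same as a TF1 frozen graph in every respect.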

1 Answer


If you use Tensorflow 2.x, add the following line before loading the model:

tf.compat.v1.disable_eager_execution()

This should make the TF1-style freeze_session code work again. I have not checked the resulting .pb file in detail, but it should work.

Feedback appreciated.

edit: However, following e.g. this thread, the TF1 and TF2 .pb files are fundamentally different. My solution might not work properly, or might not actually create a TF1-style .pb file.


If you then run into

RuntimeError: Attempted to use a closed Session.

This can be solved by restarting the kernel; you only get one shot at calling the line above per session.
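To show where the line has to go, here is a minimal sketch. The placeholder/matmul graph is a stand-in for the question's Keras model (which you would load with kr.models.load_model("model.h5") after disabling eager mode); with eager execution disabled, the TF1-style freezing call works again:

```python
import tensorflow as tf

# Must run BEFORE any graph or model is built -- and only once per kernel.
tf.compat.v1.disable_eager_execution()

# Stand-in graph; in the question's setting you would instead load the
# Keras model here and fetch the session with K.get_session().
x = tf.compat.v1.placeholder(tf.float32, shape=(None, 3), name="input_node")
w = tf.compat.v1.get_variable("w", shape=(3, 2))
y = tf.identity(tf.matmul(x, w), name="output_node")

session = tf.compat.v1.Session()
session.run(tf.compat.v1.global_variables_initializer())

# With eager disabled, convert_variables_to_constants no longer raises.
frozen_graph = tf.compat.v1.graph_util.convert_variables_to_constants(
    session, session.graph.as_graph_def(), [y.op.name]
)
tf.io.write_graph(frozen_graph, "./", "graph.pb", as_text=False)
```

If you call disable_eager_execution after the model has already been built, you hit the closed-session RuntimeError above, hence "one shot per kernel".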

Florida Man