
I'm trying to convert a BERT serving model to its equivalent TFLite format.

The snippet used to convert to the TFLite format is:

import os
import tensorflow as tf

cur_dir = os.getcwd()
saved_model_dir = os.path.join(cur_dir, 'serving_dir')

converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)

# Allow select TensorFlow ops (Flex) in addition to the TFLite builtins.
converter.target_ops = [tf.lite.OpsSet.TFLITE_BUILTINS,
                        tf.lite.OpsSet.SELECT_TF_OPS]

tflite_model = converter.convert()

with open(os.path.join(cur_dir, 'cnv1.tflite'), 'wb') as f:
    f.write(tflite_model)

If the input shape is None, I'm getting the following error:

Traceback (most recent call last):
  File "temp.py", line 13, in <module>
    tflite_model = converter.convert()
  File "/media/data2/anaconda3/envs/tf-lite/lib/python3.6/site-packages/tensorflow/lite/python/lite.py", line 406, in convert
    "'{0}'.".format(_tensor_name(tensor)))
ValueError: Provide an input shape for input array 'input_example_tensor'.
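
Passing an explicit shape avoids this error. A minimal sketch of how I supply it, using the input_shapes argument that from_saved_model accepts in TF 1.x ('input_example_tensor' is the array name from the error above):

converter = tf.lite.TFLiteConverter.from_saved_model(
    saved_model_dir,
    input_shapes={'input_example_tensor': [10]})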

Batch size is already specified in the code. With the shape given as (10,) as above, it's able to convert to its TFLite equivalent. To run inference on the generated .tflite model, the snippet from here has been used, but interpreter.allocate_tensors() throws the following:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/media/data2/anaconda3/envs/tf-lite/lib/python3.6/site-packages/tensorflow/lite/python/interpreter.py", line 73, in allocate_tensors
    return self._interpreter.AllocateTensors()
  File "/media/data2/anaconda3/envs/tf-lite/lib/python3.6/site-packages/tensorflow/lite/python/interpreter_wrapper/tensorflow_wrap_interpreter_wrapper
.py", line 106, in AllocateTensors
    return _tensorflow_wrap_interpreter_wrapper.InterpreterWrapper_AllocateTensors(self)
RuntimeError: Regular TensorFlow ops are not supported by this interpreter. Make sure you invoke the Flex delegate before inference.Node number 0 (Flex) failed to prepare.
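
For reference, that inference code follows the standard tf.lite.Interpreter pattern (a minimal sketch; the linked snippet may differ slightly):

import tensorflow as tf

# Load the converted model and allocate tensors.
interpreter = tf.lite.Interpreter(model_path='cnv1.tflite')
interpreter.allocate_tensors()  # <- raises the RuntimeError above

# The remaining steps are never reached:
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()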

Someone has reported this error on GitHub. In response, jdduke commented:

We're hoping to have that resolved for the 1.14 release.

I've tried 1.14.0rc0, only to get the following:

Traceback (most recent call last):
  File "temp.py", line 8, in <module>
    converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
  File "/media/data2/anaconda3/envs/tf-lite/lib/python3.6/site-packages/tensorflow/lite/python/lite.py", line 696, in from_saved_model
    output_arrays, tag_set, signature_key)
  File "/media/data2/anaconda3/envs/tf-lite/lib/python3.6/site-packages/tensorflow/lite/python/convert_saved_model.py", line 205, in freeze_saved_mode
l
    frozen_graph_def = util.freeze_graph(sess, in_tensors, out_tensors)
  File "/media/data2/anaconda3/envs/tf-lite/lib/python3.6/site-packages/tensorflow/lite/python/util.py", line 238, in freeze_graph
    output_arrays)
  File "/media/data2/anaconda3/envs/tf-lite/lib/python3.6/site-packages/tensorflow/python/util/deprecation.py", line 324, in new_func
    return func(*args, **kwargs)
  File "/media/data2/anaconda3/envs/tf-lite/lib/python3.6/site-packages/tensorflow/python/framework/graph_util_impl.py", line 270, in convert_variable
s_to_constants
    inference_graph = extract_sub_graph(input_graph_def, output_node_names)
  File "/media/data2/anaconda3/envs/tf-lite/lib/python3.6/site-packages/tensorflow/python/util/deprecation.py", line 324, in new_func
    return func(*args, **kwargs)
  File "/media/data2/anaconda3/envs/tf-lite/lib/python3.6/site-packages/tensorflow/python/framework/graph_util_impl.py", line 182, in extract_sub_grap
h
    _assert_nodes_are_present(name_to_node, dest_nodes)
  File "/media/data2/anaconda3/envs/tf-lite/lib/python3.6/site-packages/tensorflow/python/framework/graph_util_impl.py", line 137, in _assert_nodes_ar
e_present
    assert d in name_to_node, "%s is not in graph" % d
AssertionError: ParseExample/ParseExample:3 is not in graph