Is there any way to convert data-00000-of-00001 to a TensorFlow Lite model?
The file structure is like this:
|-semantic_model.data-00000-of-00001
|-semantic_model.index
|-semantic_model.meta
Using TensorFlow Version: 1.15
The following two steps will convert it to a .tflite model.
1. Generate a TensorFlow model for inference (a frozen graph .pb file) using the answer posted here.
What you currently have is a model checkpoint (a TensorFlow 1 model saved in three files: .data..., .meta, and .index; this model can be trained further if needed). You need to convert it to a frozen graph (a TensorFlow 1 model saved in a single .pb file; this model cannot be trained further and is optimized for inference/prediction).
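That linked answer boils down to restoring the checkpoint and constant-folding its variables. A minimal sketch using the TF 1.x compat API (the checkpoint prefix and the output node name 'logits' in the example call are assumptions; substitute your model's actual values):

```python
import tensorflow as tf

def freeze_checkpoint(meta_path, ckpt_prefix, output_node_names, out_pb):
    """Restore a TF1 checkpoint and write a frozen graph (.pb) file."""
    graph = tf.Graph()
    with graph.as_default():
        # .meta holds the graph structure; .data/.index hold the weights
        saver = tf.compat.v1.train.import_meta_graph(meta_path)
        with tf.compat.v1.Session() as sess:
            saver.restore(sess, ckpt_prefix)
            # Replace variables with constants so the graph is self-contained
            frozen = tf.compat.v1.graph_util.convert_variables_to_constants(
                sess, graph.as_graph_def(), output_node_names)
    with tf.io.gfile.GFile(out_pb, 'wb') as f:
        f.write(frozen.SerializeToString())

# Example call -- 'logits' is a placeholder output node name:
# freeze_checkpoint('semantic_model.meta', 'semantic_model',
#                   ['logits'], 'frozen_graph.pb')
```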
2. Generate a TensorFlow Lite model (.tflite file)
A. Initialize the TFLiteConverter: the .from_frozen_graph API can be defined this way, and the attributes which can be added are listed here. To find the names of the input and output arrays, visualize the .pb file in Netron.
converter = tf.compat.v1.lite.TFLiteConverter.from_frozen_graph(
    graph_def_file='....path/to/frozen_graph.pb',
    input_arrays=...,
    output_arrays=...,
    input_shapes={'...': [_, _, ...]}
)
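If Netron isn't available, the node names can also be read straight from the frozen graph itself. A small sketch:

```python
import tensorflow as tf

def list_node_names(pb_path):
    """Return the name of every node in a frozen graph (.pb) file."""
    graph_def = tf.compat.v1.GraphDef()
    with tf.io.gfile.GFile(pb_path, 'rb') as f:
        graph_def.ParseFromString(f.read())
    return [node.name for node in graph_def.node]

# Example -- the path is a placeholder:
# print(list_node_names('frozen_graph.pb'))
```

The input and output arrays are usually near the start and end of this list, respectively.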
B. Optional: Perform the simplest optimization, known as post-training dynamic range quantization. You can refer to the same document for other types of optimizations/quantization methods.
converter.optimizations = [tf.lite.Optimize.DEFAULT]
C. Convert it to a .tflite file and save it
tflite_model = converter.convert()
with open('model.tflite', 'wb') as f:
    tflite_model_size = f.write(tflite_model)
print('TFLite Model is %d bytes' % tflite_model_size)
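As a sanity check, you can then run the converted model with tf.lite.Interpreter. A minimal sketch (the input shape in the example call is a placeholder; use whatever get_input_details() reports for your model):

```python
import numpy as np
import tensorflow as tf

def run_tflite(model_path, input_array):
    """Run a single inference with the TFLite interpreter; return the output."""
    interpreter = tf.lite.Interpreter(model_path=model_path)
    interpreter.allocate_tensors()
    input_details = interpreter.get_input_details()
    output_details = interpreter.get_output_details()
    # Feed the input, cast to the dtype the model expects
    interpreter.set_tensor(input_details[0]['index'],
                           input_array.astype(input_details[0]['dtype']))
    interpreter.invoke()
    return interpreter.get_tensor(output_details[0]['index'])

# Example -- the shape is a placeholder, not your model's real input shape:
# out = run_tflite('model.tflite', np.zeros((1, 224, 224, 3), np.float32))
```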