
Is there any way to convert a data-00000-of-00001 file to a TensorFlow Lite model? The file structure is like this:

 |-semantic_model.data-00000-of-00001
 |-semantic_model.index
 |-semantic_model.meta
Tommy
  • What TensorFlow version did you use to save the model to these files? – Farmaker Jun 10 '20 at 09:25
  • I downloaded this ML data from this site (https://towardsdatascience.com/i-built-a-music-sheet-transcriber-heres-how-74708fe7c04c) – Tommy Jun 11 '20 at 02:34

1 Answer


Using TensorFlow Version: 1.15

The following 2 steps will convert it to a .tflite model.

1. Generate a TensorFlow Model for Inference (a frozen graph .pb file) using the answer posted here

What you currently have is a model checkpoint: a TensorFlow 1 model saved across three files (.data..., .index, and .meta) that can still be trained further if needed. You need to convert it to a frozen graph: a TensorFlow 1 model saved in a single .pb file that can no longer be trained and is optimized for inference/prediction.
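
The linked answer is not reproduced here, but a typical TF 1.x freezing flow looks roughly like the sketch below. The checkpoint prefix matches the files in the question; the output node name is a hypothetical placeholder that you would replace with the real one from your graph (for example, by inspecting the .pb/.meta in Netron).

import tensorflow as tf  # TF 1.15

# Hypothetical output node name -- replace with the real one from your graph
output_node_names = ['semantic/Softmax']

with tf.compat.v1.Session() as sess:
    # Rebuild the graph from the .meta file and restore the checkpoint weights
    saver = tf.compat.v1.train.import_meta_graph('semantic_model.meta')
    saver.restore(sess, 'semantic_model')  # prefix of the .data/.index files

    # Bake the variable values into the graph as constants
    frozen_graph_def = tf.compat.v1.graph_util.convert_variables_to_constants(
        sess, sess.graph_def, output_node_names)

# Write the frozen, self-contained graph to a single .pb file
with tf.io.gfile.GFile('frozen_graph.pb', 'wb') as f:
    f.write(frozen_graph_def.SerializeToString())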

2. Generate a TensorFlow Lite model (.tflite file)

A. Initialize the TFLiteConverter: the .from_frozen_graph API is used as shown below; the full set of supported attributes is listed in the TensorFlow Lite converter documentation. To find the names of the input and output arrays, visualize the .pb file in Netron.

import tensorflow as tf  # TF 1.15

converter = tf.compat.v1.lite.TFLiteConverter.from_frozen_graph(
    graph_def_file='....path/to/frozen_graph.pb',  # the frozen graph from step 1
    input_arrays=...,                              # names of the graph's input tensors
    output_arrays=...,                             # names of the graph's output tensors
    input_shapes={'...': [_, _, ...]}              # shape for each input array
)
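
For illustration only, here is what that call might look like once the placeholders are filled in. The tensor names and input shape below are hypothetical; use the values Netron shows for your own frozen graph (assumed here to be the frozen_graph.pb produced in step 1).

converter = tf.compat.v1.lite.TFLiteConverter.from_frozen_graph(
    graph_def_file='frozen_graph.pb',
    input_arrays=['input_image'],                     # hypothetical input tensor name
    output_arrays=['semantic/Softmax'],               # hypothetical output tensor name
    input_shapes={'input_image': [1, 128, 1024, 1]}   # hypothetical input shape
)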

B. Optional: Perform the simplest optimization, known as post-training dynamic range quantization. Refer to the TensorFlow Lite post-training quantization documentation for other optimization/quantization methods.

converter.optimizations = [tf.lite.Optimize.DEFAULT]

C. Convert it to a .tflite file and save it

# Convert the model to a TFLite flatbuffer
tflite_model = converter.convert()

# Write it to disk; write() returns the number of bytes written
tflite_model_size = open('model.tflite', 'wb').write(tflite_model)
print('TFLite Model is %d bytes' % tflite_model_size)
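
As a quick sanity check (not part of the original answer), you can load the saved file back with the TFLite Interpreter and inspect its input/output details before deploying it:

# Load the converted model and print its input/output tensor details
interpreter = tf.lite.Interpreter(model_path='model.tflite')
interpreter.allocate_tensors()
print(interpreter.get_input_details())
print(interpreter.get_output_details())
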
Meghna Natraj
  • Thank you for your answer. I'm stuck with Step 2. How can I convert the frozen graph to .tflite? – Tommy Jun 11 '20 at 08:14
  • I've updated the code above. To load a frozen graph into the TFLiteConverter, use `tf.lite.TFLiteConverter.from_frozen_graph` in TF 1.x or `tf.compat.v1.lite.TFLiteConverter.from_frozen_graph` in TF 2.x. Does this work for you? Could you post more details about the issue you're facing? – Meghna Natraj Jun 15 '20 at 18:38