
I'm working with TensorFlow 2.0 to train a model. I want to reduce the trained model's size by converting it to a TensorFlow Lite model. But when I load the TF Lite model, it throws the error below at the tf.saved_model.load() line:

TypeError: '_UserObject' object is not callable
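
For reference, here is a minimal sketch of the setup described above; the SavedModel directory and file names are only illustrative:

```python
import tensorflow as tf

# Illustrative path -- substitute your own SavedModel directory.
saved_model_dir = "my_saved_model"

# Convert the SavedModel to a TensorFlow Lite flatbuffer.
converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
tflite_model = converter.convert()
with open("converted_model.tflite", "wb") as f:
    f.write(tflite_model)

# tf.saved_model.load() returns a _UserObject; trying to use it as a
# callable raises "TypeError: '_UserObject' object is not callable"
# when the SavedModel does not expose a callable signature.
loaded = tf.saved_model.load(saved_model_dir)
# loaded(input_tensor)  # <- the call that fails
```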

I have done some searching and found that I should use tf.keras.models.load_model() instead of tf.saved_model.load() to run the TF Lite model. But the answers I found are not clear, and I cannot relate them to my situation.

Question: Is there any way to run a TF Lite model with tf.saved_model.load()? Or do I need to convert all my TF code to TF Keras code?

Hoang97
  • tf.saved_model.load() is not intended for loading tflite models. tflite models are for devices with limited computing power, and you should load them with a different API. – Kaveh Sep 08 '21 at 18:48
  • You can use [tfliteinterpreter](https://www.tensorflow.org/api_docs/python/tf/lite/Interpreter) to load tflite models. – Kaveh Sep 08 '21 at 18:56
  • @Kaveh Thank you for the fast reply. I'm planning to run the TF Lite model on a Jetson Nano kit. Is it possible to use "tfliteinterpreter" on the Jetson Nano kit? – Hoang97 Sep 09 '21 at 05:26
  • @Hoang97 You can run the TFLite interpreter on various platforms. If it's [Linux](https://www.tensorflow.org/lite/guide/inference#linux_platform), use the [C++](https://www.tensorflow.org/lite/guide/inference#load_and_run_a_model_in_c) or [Python API](https://www.tensorflow.org/lite/guide/inference#load_and_run_a_model_in_python). – Meghna Natraj Sep 13 '21 at 23:19
  • Hi! You can load tflite models like this: interpreter = tf.lite.Interpreter(model_path="converted_model.tflite"). Reference: https://stackoverflow.com/questions/50443411/how-to-load-a-tflite-model-in-script (see the sketch below these comments). –  Oct 10 '21 at 13:48
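
Following the comments above, here is a minimal sketch of running the converted model with tf.lite.Interpreter; the file name comes from the earlier comment, and the zero-filled input is only a placeholder:

```python
import numpy as np
import tensorflow as tf

# Load the converted model with the TFLite interpreter.
interpreter = tf.lite.Interpreter(model_path="converted_model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Placeholder input matching the model's expected shape and dtype.
input_data = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], input_data)

# Run inference and read back the result.
interpreter.invoke()
output = interpreter.get_tensor(output_details[0]["index"])
print(output.shape)
```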

0 Answers