I'd like to test inference on a TensorFlow Lite model I've loaded into an Android project.
I have some inputs, generated in a Python environment, that I'd like to save to a file, load into my Android app, and use for TFLite inference. My inputs are somewhat large; one example is:
<class 'numpy.ndarray'>, dtype: float32, shape: (1, 596, 80)
I need some way of serialising this ndarray and loading it into my Android app.
More information on TFLite inference can be found in the TensorFlow Lite documentation. In essence, the input should be a multi-dimensional array of primitive floats, or a ByteBuffer.
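For the Java side, here is a minimal sketch of reading such a file into a direct ByteBuffer of the kind `org.tensorflow.lite.Interpreter.run(...)` accepts. It assumes the file contains raw little-endian float32 values and that the shape is known out-of-band (it is not stored in the file); the class name `TfliteInput` and method `loadInput` are illustrative, not from any library:

```java
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.channels.FileChannel;
import java.nio.file.Files;
import java.nio.file.Path;

public class TfliteInput {

    // Read numFloats raw little-endian float32 values from `path` into a
    // direct ByteBuffer, which TFLite's Interpreter can consume directly.
    static ByteBuffer loadInput(Path path, int numFloats) throws IOException {
        ByteBuffer buf = ByteBuffer.allocateDirect(numFloats * 4)
                                   .order(ByteOrder.LITTLE_ENDIAN);
        try (FileChannel ch = FileChannel.open(path)) {
            while (buf.hasRemaining() && ch.read(buf) != -1) { }
        }
        buf.rewind();
        return buf;
    }

    public static void main(String[] args) throws IOException {
        // Round-trip demo with a tiny stand-in file; the real input
        // would be (1, 596, 80) = 47680 floats.
        Path path = Files.createTempFile("input", ".bin");
        ByteBuffer out = ByteBuffer.allocate(3 * 4).order(ByteOrder.LITTLE_ENDIAN);
        out.putFloat(1.5f).putFloat(-2.0f).putFloat(3.25f);
        Files.write(path, out.array());

        ByteBuffer in = loadInput(path, 3);
        System.out.println(in.getFloat() + " " + in.getFloat() + " " + in.getFloat());
        // Prints: 1.5 -2.0 3.25
    }
}
```

On Android devices (ARM) `ByteOrder.nativeOrder()` is little-endian, so this matches what TFLite expects for a direct input buffer; in an actual app you would open the file from assets or app storage rather than a temp file.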
What is the simplest way to:
- Serialise this ndarray on the Python side
- Deserialise this blob from a file on the Java side
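For the Python side, one simple approach is to skip `.npy` (which Java has no built-in reader for) and write the raw little-endian float32 bytes, leaving the shape to be hard-coded on the Android side. A minimal sketch, where `save_for_android` is just an illustrative helper name:

```python
import os
import tempfile
import numpy as np

def save_for_android(arr: np.ndarray, path: str) -> None:
    # Write raw little-endian float32 bytes ('<f4'); note the shape is
    # NOT stored in the file, so the Java side must already know it.
    arr.astype("<f4").tofile(path)

# Example with the shape from the question:
arr = np.random.rand(1, 596, 80).astype(np.float32)
path = os.path.join(tempfile.gettempdir(), "input.bin")
save_for_android(arr, path)

# Sanity check: read back exactly the bytes the Java side will see.
restored = np.fromfile(path, dtype="<f4").reshape(1, 596, 80)
assert np.array_equal(arr, restored)
```

The resulting file is then `1 * 596 * 80 * 4` bytes, which a Java `FileChannel` can read straight into a little-endian `ByteBuffer`.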
Thanks!