
I'd like to test inference on a TensorFlow Lite model I've loaded into an Android project.

I have some inputs generated in a Python environment I'd like to save to a file, load into my Android app and use for TFLite inference. My inputs are somewhat large, one example is:

<class 'numpy.ndarray'>, dtype: float32, shape: (1, 596, 80)

I need some way of serialising this ndarray and loading it into my Android app.

More information on TFLite inference can be found here. In essence, this should be a multi-dimensional array of primitive floats, or a ByteBuffer.

What is the simplest way to:

  • Serialise this ndarray on the Python side
  • Deserialise this blob from a file on the Java side
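(For comparison, one minimal route — a sketch, assuming raw little-endian float32 bytes are acceptable on the Java side — is to skip any container format and dump the raw buffer, which a Java ByteBuffer can consume directly after setting the byte order. The array below is a hypothetical stand-in for the real inputs.)

```python
import numpy as np

# Hypothetical stand-in for the real model input
data = np.random.rand(1, 596, 80).astype(np.float32)

# Write raw little-endian float32 bytes. On the Java side these can be
# read into a ByteBuffer and viewed with
# order(ByteOrder.LITTLE_ENDIAN).asFloatBuffer().
data.astype("<f4").tofile("data.bin")

# Sanity check: read the bytes back and restore the original shape
restored = np.fromfile("data.bin", dtype="<f4").reshape(1, 596, 80)
assert np.array_equal(data, restored)
```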

Thanks!

OscarVanL
  • I'm leaning towards flattening the 2D array into a 1D array, then using protocol buffers to serialise it, then recreating the array shape on the Java side, but this seems a little overkill. – OscarVanL Dec 08 '20 at 17:20
  • What is your purpose? You want to test java-side code including inference or just tflite model? – Alex K. Dec 08 '20 at 19:47
  • Essentially my problem is that I am trying to debug weird behaviour where my TFLite model has much worse accuracy on Android compared to on Windows. For this reason, I'd like to do inference on the TFLite model using exactly the same inputs on both Windows and Android to see if the results match (as they should). – OscarVanL Dec 09 '20 at 00:26
  • Have not tried but looks legit: [CSV](https://stackoverflow.com/a/43055945/14161847) file or plain text. If you do not want to mess with the Android filesystem, put your data into assets. Another option is to make two equal data generators with the same output; that can be handy for testing preprocessing – Alex K. Dec 09 '20 at 08:07

1 Answer


I figured this out in the end. There's a handy Java library called JavaNpy that allows you to open .npy files in Java, and therefore in Android.

On the Python side, I saved the flattened array as a .npy file in the normal way:

data_flat = data.flatten()
print(data_flat.shape)
np.save(file="data.npy", arr=data_flat)
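(A quick round-trip check on the Python side — a sketch using a hypothetical stand-in array — confirms that flattening and saving loses nothing, since the original shape can be restored with a single reshape:)

```python
import numpy as np

# Hypothetical stand-in for the real model input
data = np.random.rand(1, 596, 80).astype(np.float32)

data_flat = data.flatten()
np.save(file="data.npy", arr=data_flat)

# Reload the flat array and restore the original shape
reloaded = np.load("data.npy").reshape(1, 596, 80)
assert np.array_equal(data, reloaded)
```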

In Android I placed this into the assets folder.

I then loaded it into JavaNpy:

InputStream stream = context.getAssets().open("data.npy");  // may throw IOException
Npy npy = new Npy(stream);
float[] npyData = npy.floatElements();

And finally converted it into a TensorBuffer:

int[] inputShape = new int[]{1, 596, 80};  // the data shape before I flattened it
TensorBuffer tensorBuffer = TensorBuffer.createFixedSize(inputShape, DataType.FLOAT32);
tensorBuffer.loadArray(npyData);

I then used this tensorBuffer for inference on my TFLite model.

OscarVanL