
After several weeks of development, I have finally created an AI in an Android app that works using the matrix-manipulation API provided by the Nd4j library. The library was imported into the project with Gradle, following this documentation.

Unfortunately, I'm finding out that Nd4j depends on some release-killingly large runtime libraries, especially libnd4jcpu.so, which is about 150 MB per ABI platform, leading to APK sizes upwards of half a gigabyte! For comparison, the average app size you'll find on the Google Play Store is about 11.5 MB.

The compressed download size limit of Android App Bundles on Google Play is 150MB.

The problem of how to reduce the size of the DL4J dependencies was raised in a previous StackOverflow question. No solution was offered, however, other than being more selective about which platforms you support. Even restricted to a single ABI platform, that still means an APK size of at least ~200 MB.

One has to wonder why the Deeplearning4J community has gone to the effort of supporting Android mobile development in the first place, and why the inevitable problem of runtime library dependency sizes isn't so much as mentioned anywhere in the documentation.

Surely I am missing something here?

2 Answers


"Nd4j" is actually a set of Java libraries plus a self-contained C++ library compiled to native binaries per platform, bundled into the jar for fast performance. You typically want to strip those dependencies down in your build. You can see how to do that here:

https://github.com/bytedeco/javacpp-presets/wiki/Reducing-the-Number-of-Dependencies

Nd4j relies on JavaCPP for packaging. In short, you can either specify -Dplatform=android-x86_64 or -Dplatform=android-arm64 (depending on the architecture) in your Maven/Gradle build if you use nd4j-native-platform, or you can use the nd4j-native dependency (no classifier) plus the classifier for your platform.
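As a rough sketch, a Gradle dependency block using the single-classifier approach might look like this (the version string here is illustrative; substitute the Nd4j release you actually use, and the classifier for your target ABI):

```groovy
dependencies {
    // nd4j-native with no classifier pulls in the Java API only;
    // the second line adds native binaries for exactly one platform,
    // instead of nd4j-native-platform, which bundles all of them.
    implementation "org.nd4j:nd4j-native:1.0.0-beta7"
    implementation "org.nd4j:nd4j-native:1.0.0-beta7:android-arm64"
}
```

Repeat the classifier line once per ABI you actually ship, rather than shipping binaries for every platform.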

Editing my response a bit; sorry I didn't get the time to fully read your question this morning. Let me respond point by point.

First of all this: "One has to wonder why the Deeplearning4J community has gone to the effort of supporting Android mobile development in the first place.."

  1. Fair point, and I want to work on this. Please be a bit open-minded when working with us here. Generally people come to us with specific requirements and we work with them on their specific circumstances. Sometimes we help them minimize binary size via #2.

  2. Regarding "especially libnd4jcpu.so, which is about 150mb per abi platform": as it stands, folks generally come with different spins of their apps, and we've admittedly focused more on op coverage than binary size. There's a minifier we have that could help: https://github.com/eclipse/deeplearning4j/tree/master/libnd4j/minifier I'm happy to help with your use case if you can be more specific, but it's not quite a "just read the docs and go about it on your own" experience.


Adam Gibson
    As I said in my question Adam: "No solution could be offered however, except just to be more selective about what platforms you support. Again, per abi platform, that still means a minimum APK size of at minimum ~200MB." - I literally cite the same link. – John Rayner-Hilles Mar 02 '21 at 12:10
  • So it depends on what functionality you use; your app size will scale with the number of native libraries you use (e.g. OpenCV). If you're just using nd4j-native, the native binaries for beta7 are 30 MB, which is still sizable, I understand. We could probably strip it down; if you want a more specific solution we might be able to work with you a bit. – Adam Gibson Mar 02 '21 at 13:04

If anyone reading this happens to be faced with the same situation as I was: I've switched over to the new org.jetbrains.kotlinx Multik library, which provides the same basic NDArray operations I needed Nd4j for. Upon testing it in an app, it adds virtually nothing in size, but it has reduced functionality; for example, as of yet you cannot invert a matrix with Multik.
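For anyone evaluating the switch, the basic operations look roughly like this in Multik. This is a sketch based on the early Multik releases; check the current API docs, as the package layout and builder syntax may have changed:

```kotlin
import org.jetbrains.kotlinx.multik.api.mk
import org.jetbrains.kotlinx.multik.api.ndarray
import org.jetbrains.kotlinx.multik.ndarray.operations.plus
import org.jetbrains.kotlinx.multik.ndarray.operations.times

fun main() {
    // Build a 2x2 matrix; mk[...] is Multik's literal-style builder.
    val a = mk.ndarray(mk[mk[1.0, 2.0], mk[3.0, 4.0]])

    // Elementwise arithmetic via operator overloads.
    val b = a * 2.0 + a

    // Matrix product via the linalg module.
    val c = mk.linalg.dot(a, a)

    println(b)
    println(c)
    // Note: as mentioned above, there is no matrix-inverse routine,
    // so anything relying on InvertMatrix-style calls in Nd4j
    // has to be reworked or implemented by hand.
}
```

The dependency is a pure-JVM/Kotlin artifact by default, which is why it adds essentially nothing to the APK compared with Nd4j's bundled native binaries.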