
In Marshmallow, an AOT compiler was added with ART. From Android N, a JIT compiler was added in addition to AOT.

What are the AOT compiler's specific jobs/features, and what are the JIT compiler's jobs/features?

NoDataDumpNoContribution
0xAliHn

3 Answers


In Android, Java classes are compiled into DEX bytecode. The DEX bytecode is then translated to native machine code by either the ART or the Dalvik runtime.

Dalvik is a JIT (just-in-time) compilation based engine. There were drawbacks to using Dalvik, so ART was introduced as a runtime in Android 4.4 (KitKat), and from Android 5.0 (Lollipop) it completely replaced Dalvik. Android 7.0 adds a just-in-time (JIT) compiler with code profiling to the Android runtime (ART) that constantly improves the performance of Android apps as they run.

(Dalvik used JIT (Just in time) compilation whereas ART uses AOT (Ahead of time) compilation.)

Just In Time (JIT):

With the Dalvik JIT compiler, each time the app is run, it dynamically translates a part of the Dalvik bytecode into machine code. As the execution progresses, more bytecode is compiled and cached. Since JIT compiles only a part of the code, it has a smaller memory footprint and uses less physical space on the device.
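
A rough way to see this warm-up behaviour is to time the same hot method over and over. The sketch below is not Android-specific (it behaves similarly on any JIT-based JVM); the class name, method, and iteration counts are invented for illustration, and absolute numbers vary widely between devices and runtime versions.

    // Minimal JIT warm-up sketch: the first few rounds usually run interpreted,
    // later rounds are typically faster once the runtime has compiled the hot method.
    public class JitWarmup {

        // A deliberately hot method that the JIT is likely to compile after enough calls.
        static long sumOfSquares(int n) {
            long total = 0;
            for (int i = 0; i < n; i++) {
                total += (long) i * i;
            }
            return total;
        }

        public static void main(String[] args) {
            for (int round = 0; round < 10; round++) {
                long start = System.nanoTime();
                long result = sumOfSquares(1_000_000);
                long micros = (System.nanoTime() - start) / 1_000;
                System.out.println("round " + round + ": " + micros + " us (result=" + result + ")");
            }
        }
    }

On a JIT runtime the later rounds usually report lower times than the first ones; with pure AOT compilation the timings are roughly flat from the first call, because the native code already exists.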

Ahead Of Time (AOT):

ART is equipped with an ahead-of-time compiler. During the app’s installation phase, it statically translates the DEX bytecode into machine code and stores it in the device’s storage. This is a one-time event which happens when the app is installed on the device.

Android N includes a hybrid runtime:

There is no compilation during install, and applications can be started right away, with the bytecode being interpreted. There is a new, faster interpreter in ART, accompanied by a new JIT, but the JIT-compiled code is not persisted. Instead, the code is profiled during execution and the resulting data is saved.

Benefits of ART:

  • Apps run faster, because DEX bytecode translation is done during installation.
  • Reduces the startup time of applications, as native code is executed directly.
  • Improves battery performance, because the power used to interpret bytecode line by line is saved.
  • Improved garbage collector.

Drawbacks of ART:

  • App installation takes more time, because the DEX bytecode is converted into machine code during installation.

  • As the native machine code generated on installation is stored in internal storage, more internal storage is required.

Mitesh Vanaliya

Compilers need two things to generate performant code: information and resources.

JIT compilers have way more information at their disposal than AOT compilers. Static analysis is impossible in the general case (nearly everything interesting you would want to know about a program can be reduced to either the Halting Problem or Rice's Theorem), and hard even in the special case. JIT compilers don't have this problem: they don't have to statically analyze the program, they can observe it dynamically at runtime.
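
To make the point about runtime information concrete, here is a minimal sketch (class and method names are invented): which List implementation reaches process() depends on a runtime value, so a static compiler has to stay conservative about the virtual calls inside the loop, while a JIT can simply observe that only ArrayList shows up in this particular run and specialize the call site accordingly.

    import java.util.ArrayList;
    import java.util.LinkedList;
    import java.util.List;

    // Minimal sketch: the concrete type flowing into process() is only known at runtime.
    public class RuntimeInfo {

        // The iteration inside the loop goes through virtual calls; their target
        // depends on the dynamic type of 'values'.
        static int process(List<Integer> values) {
            int sum = 0;
            for (int v : values) {
                sum += v;
            }
            return sum;
        }

        public static void main(String[] args) {
            List<Integer> values;
            if (args.length > 0) {
                values = new LinkedList<Integer>();   // chosen only if an argument is passed
            } else {
                values = new ArrayList<Integer>();    // the type actually seen in a typical run
            }
            for (int i = 0; i < 1_000; i++) {
                values.add(i);
            }
            System.out.println(process(values));
        }
    }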

Plus, a JIT compiler has techniques at its disposal that AOT compilers don't, the most important one being de-optimization. Now, you might think: why is de-optimization important for performance? Well, if you can de-optimize, then you can be over-aggressive in making optimizations that are actually invalid (like inlining a method call that may or may not be polymorphic), and if it turns out that you were wrong, you can de-optimize back to the un-inlined case (for example).
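
Here is a minimal sketch of that speculative-inlining scenario (all class names are invented, and whether a given VM actually speculates like this is implementation-dependent): as long as only Circle instances reach the call site, a JIT may treat it as monomorphic and inline Circle.area(); once a Square appears, that assumption becomes invalid and the JIT de-optimizes back to a plain virtual call. An AOT compiler has to emit the conservative virtual call up front.

    // Minimal de-optimization sketch: a virtual call site that looks monomorphic at first.
    public class Deopt {

        interface Shape { double area(); }

        static class Circle implements Shape {
            final double r;
            Circle(double r) { this.r = r; }
            public double area() { return Math.PI * r * r; }
        }

        static class Square implements Shape {
            final double s;
            Square(double s) { this.s = s; }
            public double area() { return s * s; }
        }

        static double totalArea(Shape[] shapes) {
            double total = 0;
            for (Shape shape : shapes) {
                total += shape.area();   // the call a JIT may speculatively inline
            }
            return total;
        }

        public static void main(String[] args) {
            Shape[] shapes = new Shape[100_000];
            for (int i = 0; i < shapes.length; i++) {
                shapes[i] = new Circle(1.0);         // only Circles: the call site looks monomorphic
            }
            System.out.println(totalArea(shapes));

            shapes[0] = new Square(2.0);             // a second type appears: the speculation fails,
            System.out.println(totalArea(shapes));   // so the JIT can de-optimize and recompile
        }
    }

If you run this on a VM that speculates, the second call to totalArea() may briefly be slower while the method is de-optimized and recompiled; the exact behaviour is VM-specific.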

However, there's the problem of resources: an AOT compiler can take as much time as it wants, and use as much memory as it wants. A JIT compiler has to steal its resources away from the very program that the user wants to use right now.

Normally, that is not a problem. Today's machines are so ridiculously overpowered that there are always enough resources at the JIT's disposal. This is especially true since the JIT uses the most resources when a lot of new code is introduced into the system at once, which is usually during program startup or when the program transitions between phases (for example, from parsing the configuration files to setting up the object graph, or from finishing configuration to starting the actual work), at which times the program itself typically does not yet use many resources. The Azul JCA is a good example. It has 864 cores and 768 GiByte RAM in its biggest configuration (and note that these haven't been sold for quite some time, so that is actually several-years-old technology). According to Azul's measurements, the JIT uses maybe 50 cores when it is working very hard. That still leaves more than 800 cores for the program, the system and the GC.

But your typical Android device doesn't have 1000 cores and a TiByte of RAM. And it is extremely interactive and latency-sensitive: when the user starts, say, WhatsApp, they want to write a message right now. Not in 500 msec, when the JIT has warmed up. NOW.

That's what makes AOT attractive here. Also note that JIT compiling will not only steal resources away from the running program; it will also need battery power, and it will need it every time the program runs, whereas an AOT compiler only needs to spend that power budget once, when the app is installed.

You could go even more extreme and push the compilation off to the app store or even to the developer, like Apple does, but Apple has the advantage of a much more limited set of possible target platforms to consider, so on-device AOT compilation seems a reasonable trade-off for Android.

Muhammad Farhan Habib
  • Good answer you know but waaaay too long. – Det Feb 27 '19 at 10:53
  • Great answer. Could even be longer and I would still like to read it. I ask myself if the additional information available to JIT really brings much in performance gain. – NoDataDumpNoContribution May 27 '19 at 09:02
  • @farhan Why doesn't android compile .oat or native binaries directly when building apk So that there would be no compilation required after installing the app? – Siva Jun 26 '21 at 03:04

JIT vs AOT

.java -> .class -> .dex (by DX/D8) -> native machine code
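
As a minimal sketch of what travels through that pipeline (the class itself is invented; the tool names are the usual ones), a single source file is compiled by javac into JVM bytecode, converted by D8 into DEX bytecode packaged in the APK, and finally turned into native code by ART, either ahead of time by dex2oat or at runtime by the JIT:

    // Hello.java   --(javac)-->  Hello.class   (JVM bytecode)
    // Hello.class  --(D8)----->  classes.dex   (DEX bytecode inside the APK)
    // classes.dex  --(ART)---->  native code   (dex2oat at install time and/or JIT at runtime)
    public class Hello {
        public static void main(String[] args) {
            System.out.println("Hello, ART");
        }
    }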

Just In Time (JIT) - Dalvik (register based) - generates machine code at runtime, just before execution. It has a smaller memory/storage footprint, but at the cost of higher CPU usage (lags), compilation repeated on every run, and shorter battery lifetime.

Ahead Of Time (AOT) - Android Runtime (ART) - generates machine code during installation. It was introduced in API 19 and became the default in API 21. Its main drawback is a longer first-time install/compile. ART also optimised memory allocation and the Garbage Collector (GC) - only one GC pause instead of two.

From API 24, ART uses a hybrid approach with both AOT and JIT [ClassLoader]

yoAlex5