16

In a lecture about the JIT in HotSpot I want to give as many examples as possible of the specific optimizations that the JIT performs.

So far I only know about "method inlining", but there must be much more. Please post one example per answer so each one can be voted on.

Artiom Gourevitch

5 Answers

15

Well, you should scan Brian Goetz's articles for examples.

In brief, HotSpot can and will:

  1. Inline methods
  2. Join adjacent synchronized blocks on the same object (lock coarsening; see the sketch below)
  3. Eliminate locks whose monitor is not reachable from other threads (lock elision)
  4. Eliminate dead code (which is why most naive micro-benchmarks are meaningless)
  5. Drop memory writes for non-volatile variables
  6. Replace interface calls with direct method calls for interface methods that have only one implementation

et cetera
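
For instance, here is a minimal sketch (class and method names are invented for illustration) of code shapes that match points 2-4 above:

    // Hypothetical examples; the names are made up for illustration.
    public class JitCandidates {

        private int counter;

        // Point 2: adjacent synchronized blocks on the same object can be
        // coarsened into a single synchronized region.
        public void coarseningCandidate(Object lock) {
            synchronized (lock) { counter++; }
            synchronized (lock) { counter++; }
        }

        // Point 3: the lock object never escapes this thread, so escape
        // analysis lets the synchronization be elided entirely.
        public int elisionCandidate() {
            Object localLock = new Object();
            synchronized (localLock) {
                return counter;
            }
        }

        // Point 4: the loop's result is never used, so the whole loop can be
        // removed as dead code -- which is why naive micro-benchmarks that
        // ignore their results often measure nothing at all.
        public void deadCodeCandidate() {
            int x = 0;
            for (int i = 0; i < 1_000_000; i++) {
                x += i;
            }
            // x is never read again
        }
    }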

alf
10

There is a great presentation on the optimizations used by modern JVMs on the Jikes RVM site: ACACES’06 - Dynamic Compilation and Adaptive Optimization in Virtual Machines

It discusses architecture, tradeoffs, measurements, and techniques, and it names at least 20 things JVMs do to optimize the machine code.

Philippe Aubertin
Patrick
7

I think the interesting stuff is the things a conventional ahead-of-time compiler can't do, in contrast to the JIT. Inlining methods, eliminating dead code, CSE, liveness analysis, etc. are all done by your average C++ compiler as well; nothing "special" there.

But optimizing something based on optimistic assumptions and then deoptimizing later if they turn out to be wrong (assuming a specific type, removing branches that would fail later anyway, ...)? Removing virtual calls when we can guarantee that only one implementing class exists at the moment (again, something that only works reliably with deoptimization)? Adaptive optimization is, I think, the one thing that really distinguishes the JIT from your run-of-the-mill C++ compiler.

Maybe also mention the runtime profiling the JIT does to decide which optimizations to apply (not that unique anymore with all the profile-guided optimization in static compilers, though).
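
As a minimal sketch (the interface and class names here are invented), consider a virtual call that HotSpot can speculatively devirtualize as long as only one implementation is loaded:

    // Hypothetical illustration; the names are made up.
    interface Shape {
        double area();
    }

    class Circle implements Shape {
        private final double r;
        Circle(double r) { this.r = r; }
        public double area() { return Math.PI * r * r; }
    }

    class Renderer {
        // While Circle is the only loaded implementation of Shape, HotSpot can
        // compile s.area() as a direct call to Circle.area() and even inline it.
        // If a second implementation is loaded later, the compiled code is
        // deoptimized and the call falls back to true virtual dispatch.
        static double totalArea(Shape[] shapes) {
            double sum = 0;
            for (Shape s : shapes) {
                sum += s.area();
            }
            return sum;
        }
    }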

Voo
5

There's an old but likely still valid overview in this article.

The highlights seem to be performing classical optimizations based on available runtime profiling information:

  • JITting "hot spots" into native code
  • Adaptive inlining – inlining the most commonly called implementations for a given method dispatch, to avoid excessive code growth

And some supporting ones like generational GC, which makes allocating short-lived objects cheaper, plus various other smaller optimizations and whatever else has been added since that article was published.
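
For example (a sketch with invented names), allocation patterns like the following are exactly what a generational collector handles cheaply; escape analysis may even remove the allocation altogether:

    // Hypothetical example; the names are invented.
    final class Point {
        final int x, y;
        Point(int x, int y) { this.x = x; this.y = y; }
    }

    class Distances {
        // Each Point dies immediately after use: cheap to collect in the
        // young generation, and escape analysis may scalar-replace it so
        // that no heap allocation happens at all.
        static long sumCoordinates(int n) {
            long sum = 0;
            for (int i = 0; i < n; i++) {
                Point p = new Point(i, i + 1);
                sum += p.x + p.y;
            }
            return sum;
        }
    }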

There's also a more detailed official whitepaper, and a fairly nitty-gritty HotSpot Internals wiki page that describes how to write fast Java code, which should let you extrapolate what use cases were optimized.

František Hartman
millimoose
2

It jumps to equivalent native machine code instead of interpreting the JVM op-codes. Removing the need to simulate a machine (the JVM) in machine code for a heavily used part of a Java application (which effectively becomes an extension of the JVM) provides a good speed increase.

Of course, that's most of what HotSpot is.
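
As a small illustration (the class name is invented, but -XX:+PrintCompilation is a standard HotSpot flag), you can watch this happen on a hot method:

    // Run with: java -XX:+PrintCompilation HotLoop
    // Once compute() becomes hot, HotSpot logs its compilation to native code.
    public class HotLoop {
        static long compute(int i) {
            return (long) i * i;
        }

        public static void main(String[] args) {
            long sum = 0;
            for (int i = 0; i < 10_000_000; i++) {
                sum += compute(i);
            }
            System.out.println(sum);
        }
    }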

Edwin Buck