Analyzing some pre-existing third-party Java
code (which cannot be rewritten) in an application, I discovered that it is almost entirely structured around long method chains of the form
public Object process() {
    // Do some class-specific stuff
    return this.getChild().process();
}
where a base class defines the process()
method, subclasses override it, and the same getChild().process()
call is repeated at the end of every override.
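For concreteness, here is a minimal sketch of the structure as I understand it (the class and method names other than process() and getChild() are made up, since the real code cannot be shown):

```java
// Hypothetical reconstruction of the pattern: a base class defines
// process(), and each subclass's override ends with the same
// this.getChild().process() delegation to the next link.
class Node {
    private Node child;

    Node setChild(Node child) {
        this.child = child;
        return this;
    }

    Node getChild() {
        return child;
    }

    public Object process() {
        // Base behaviour: terminal link of the chain.
        return "done";
    }
}

class StepA extends Node {
    @Override
    public Object process() {
        // Do some class-specific stuff, then delegate down the chain.
        return this.getChild().process();
    }
}

class StepB extends Node {
    @Override
    public Object process() {
        // Different class-specific stuff, same trailing delegation.
        return this.getChild().process();
    }
}

public class ChainDemo {
    public static void main(String[] args) {
        // A three-link chain: StepA -> StepB -> terminal Node.
        Node chain = new StepA().setChild(new StepB().setChild(new Node()));
        System.out.println(chain.process());  // prints "done"
    }
}
```

In the real code the chains are far longer than this, but the shape of each link is the same.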
Multiple such chains are created (branched off) at runtime as a result of conditional blocks. Some return relatively early (e.g., after 10-20 "links" in the chain), but most commonly the chain is much longer and can even exceed 100 consecutive method calls before returning a result.
The application uses this code to process large files; the result changes for every invocation of the top-level method, typically once per line of each file.
So, two questions:
- What performance implications should be expected from such a design, compared to a "regular" one?
- Is there any trick to notably improve the performance of this code as-is (e.g., by changing some JVM parameter)?