The following is a quote by Doug Lea, available at this link:
Even though computation may be parallel by default at the instruction level, in Local mode, the observable results of dependency-triggered execution are always equivalent to those of purely step-wise sequential execution, whether or not any allowed optimizations actually occur. The exact relationships between statement order and execution order that maintain the associated uniprocessor semantics don't matter, and cannot even be detected (except possibly by tools such as debuggers). There are no source-level programmer controls available to alter these relationships.
What does he mean by "dependency-triggered execution"?
Some context that might help:
Plain mode applies to syntactic accesses of plain (non-volatile) object fields (as in `int v = aPoint.x`), as well as statics and array elements. It also applies to default VarHandle `get` and `set` access. Even though it behaves in the same way as always, its properties interact with new VarHandle modes and operations in ways best explained in terms of a quick review of relevant aspects of processor and compiler design.
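For concreteness, here is a minimal sketch of the two kinds of access that paragraph covers; the `Point` class, the `X` VarHandle, and the class name are my own illustration (following the `aPoint.x` example in the quote), not something from Lea's document:

```java
import java.lang.invoke.MethodHandles;
import java.lang.invoke.VarHandle;

class Point {
    int x;
}

public class PlainModeExample {
    // VarHandle for Point.x; its default get/set access mode is Plain,
    // the same mode as a direct syntactic field access.
    static final VarHandle X;
    static {
        try {
            X = MethodHandles.lookup()
                             .findVarHandle(Point.class, "x", int.class);
        } catch (ReflectiveOperationException e) {
            throw new ExceptionInInitializerError(e);
        }
    }

    public static void main(String[] args) {
        Point aPoint = new Point();

        aPoint.x = 1;                 // plain syntactic write
        int v = aPoint.x;             // plain syntactic read (as in the quote)

        X.set(aPoint, 2);             // default VarHandle set: also Plain mode
        int w = (int) X.get(aPoint);  // default VarHandle get: also Plain mode

        System.out.println(v + " " + w);
    }
}
```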
Plain mode extends the otherwise unnamed "Local" mode in which all accesses are to method-local arguments and variables; for example, the code for pure expressions and functions. Plain mode maintains local precedence order for accesses, which need not match source code statement order or machine instruction order, and is not in general even a total (sequential) order.
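And here is a tiny sketch of how I currently read "local precedence order" (again my own illustration, not from the document): statements with no dependency on each other appear free to run in either order, while a dependent statement has to wait for its inputs.

```java
public class LocalOrderExample {
    public static void main(String[] args) {
        int x = 1;
        int y = 2;

        // No dependency between these two computations, so a compiler or CPU
        // may evaluate them in either order (or in parallel); the observable
        // result is the same as step-wise sequential execution.
        int a = x + 10;
        int b = y + 20;

        // This computation depends on both a and b, so it can only run once
        // both are available: execution is "triggered" by its dependencies,
        // not by its position in the statement order.
        int sum = a + b;

        System.out.println(sum); // always 33
    }
}
```

Is that dependency relationship between computations what he means by "dependency-triggered execution", or is he referring to something else?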