One question stood out as really out of place, so I'll answer that (and only that):
Essentially, why do novices often assume that JavaScript is asynchronous by default? What is the quality of JavaScript that leads beginners to make this assumption compared to Java, where such an assumption is never made?
JavaScript (henceforth referred to as ECMAScript or ES) is predominantly used for two purposes:
- websites (or other browser-hosted environments), and
- Node.js.
In both of these contexts, the underlying host infrastructure exists to take advantage of ES's support for first-class functions and closures: browser interaction events, setTimeout, AJAX, Web Workers, and the slew of async support in Node.js.
(Neither first-class functions nor closures are required for asynchronous programming; they just make it 'really easy' in JavaScript.)
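As a minimal sketch of why those two language features make async "really easy" (the `makeLogger` name is mine, purely for illustration): a function can be created, stored, and handed to a host API like `setTimeout`, and the closure it forms keeps its captured variables alive until the host finally invokes it.

```javascript
// First-class functions: a function is an ordinary value you can return and pass.
// Closures: the returned function still sees `prefix` long after makeLogger returns.
function makeLogger(prefix) {
  return function (msg) {
    return prefix + ": " + msg;
  };
}

const log = makeLogger("net");

// Hand the closure to an async host API; when the timer fires later,
// the callback still has access to `prefix` via its closure.
setTimeout(() => console.log(log("response received")), 0);

// This line prints first: the current synchronous block always
// finishes before any timer callback is allowed to run.
console.log(log("request sent"));
```

Nothing here requires closures in principle; a language could pass all needed state explicitly. Closures just make the "remember my context until the host calls me back" pattern nearly free.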
Thus, most ES programmers never encounter an environment whose host does not supply native constructs for asynchronous programming, and asynchronous callbacks are used (indeed required) to write most programs of interest.
However, if the host did not expose any asynchronous methods, then "asynchronous by default" would be revealed as a host-provided addition that merely leverages the ES language features:
ES program execution is always synchronous (this is why a callback must run to completion and cannot be pre-empted), although these tiny synchronous blocks can be interleaved by the host.
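The run-to-completion behavior is directly observable. In the sketch below (a Node.js or browser console snippet), two timers are already due while a synchronous block is still running, yet neither callback can pre-empt it; they only run, in order, after the block finishes:

```javascript
const order = [];

// Both timers become due almost immediately...
setTimeout(() => order.push("A"), 0);
setTimeout(() => order.push("B"), 0);

// ...but this synchronous block cannot be pre-empted:
// busy-wait roughly 5ms, well past both timers' due times.
const start = Date.now();
while (Date.now() - start < 5) { /* spin */ }
order.push("sync");
// At this point order is still ["sync"]: the callbacks have not run.

// Once the stack empties, the queued callbacks are interleaved in order.
setTimeout(() => console.log(order.join(",")), 10); // logs "sync,A,B"
```

This is the whole trick: the host queues work and feeds it to the engine one synchronous block at a time, which is why a single slow callback can freeze a page even though the environment "feels" asynchronous.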