So I am working through the JavaScript MDN re-intro tutorial and got to the part on floating point imprecision. https://developer.mozilla.org/en-US/docs/Web/JavaScript/A_re-introduction_to_JavaScript
Why do they use this example? Wouldn't you get the same result in almost any language?
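For reference, the kind of thing I mean is something like this (a sketch from memory; the exact snippet MDN uses may differ):

```js
// Classic floating point imprecision example
console.log(0.1 + 0.2);         // 0.30000000000000004
console.log(0.1 + 0.2 === 0.3); // false
```

As far as I know, any language that stores numbers as IEEE 754 doubles (Python, Java, C, etc.) gives the same result, so it doesn't seem specific to JavaScript.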