A lot has been said on this topic, but I could not find an exact answer to my question.
JavaScript cannot exactly represent decimal numbers such as 0.1, and this is understandable.
For example, the following comparison is true because of the rounding error that occurs during the multiplication:
0.1 * 3 === 0.30000000000000004
This is fine; it is all in accordance with the IEEE Standard for Floating-Point Arithmetic (IEEE 754).
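Here is a small JavaScript check that shows what I mean (the commented outputs are what I would expect from any IEEE 754 double implementation):

const product = 0.1 * 3;
console.log(product);                 // 0.30000000000000004
console.log(product === 0.3);         // false
console.log(product.toPrecision(21)); // prints extra digits of the stored double (about 0.3000000000000000444)
console.log(product.toFixed(1));      // 0.3 (rounded only for display; toFixed returns a string)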
What I cannot understand is why other languages that also use this standard give more accurate results:
0.1 * 3 === 0.3
Is this because of the different rounding rules that they use (https://en.wikipedia.org/wiki/IEEE_floating_point#Rounding_rules), or am I missing something?
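In case it is relevant, here is a sketch showing that the very same JavaScript double can be made to print as 0.3 simply by limiting the number of significant digits; I do not know whether this is what those other languages do by default, hence the question:

const product = 0.1 * 3;
console.log(product.toPrecision(15)); // 0.300000000000000 (looks exact at 15 significant digits)
console.log(product.toPrecision(17)); // 0.30000000000000004 (the error shows at 17 digits)
console.log(Number(product.toPrecision(15)) === 0.3); // true once rounded back from 15 digits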