When I run a "for" loop that starts at 1 and increments its index by 0.1, and I log that index, the output is not a number with a single decimal digit but one with many decimal digits (code and results are shown below).
for (let i = 1; i <= 2; i += 0.1) {
console.log(i);
}
I expected to see the following series:
1
1.1
1.2
1.3
...
Instead, what I actually get is:
1
1.1
1.2000000000000002
1.3000000000000003
1.4000000000000004
1.5000000000000004
1.6000000000000005
1.7000000000000006
1.8000000000000007
1.9000000000000008
This happens in every browser, and it behaves the same whether I run the loop directly in the browser console or write it in Visual Studio Code. Does anyone have an explanation for this?
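
For reference, the same kind of drift seems to show up even without a loop; a minimal check using nothing beyond plain JavaScript and console.log:

console.log(0.1 + 0.2);         // prints 0.30000000000000004 rather than 0.3
console.log(0.1 + 0.2 === 0.3); // prints false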