I need to calculate the middle point (the average) between two real numbers in JavaScript. The values can vary widely in magnitude, generally between 0.0001 and 10000.
The naive approach
(parseFloat(first) + parseFloat(second)) / 2
gives me unwanted precision errors, e.g.
(1.1 + 0.1) / 2 = 0.6000000000000001
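For context, here is roughly what I am doing at the moment (assuming first and second arrive as strings; the variable names are just placeholders):

var first = "1.1";
var second = "0.1";
// naive average of the two parsed values
var middle = (parseFloat(first) + parseFloat(second)) / 2;
console.log(middle); // logs 0.6000000000000001, but I want 0.6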
How can I ensure that the result does not have extra decimal places? I guess that, since there are exactly two inputs, the result will need at most one more decimal place than the inputs (see my rough attempt below the examples). So, I need:
1000 and 3000 to return 2000 (without decimal places)
1234.5678 and 2468.2468 to return 1851.4073
0.001 and 0.0001 to return 0.00055
10000 and 0.0001 to return 5000.00005
0.1 and 1.1 to return 0.6
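This is a rough sketch of the kind of workaround I have in mind: round the average to one more decimal place than the inputs. The helper names decimalPlaces and middle are my own, and it assumes the inputs are plain decimal numbers or strings (no exponential notation):

// Count the decimal places in the string form of a value
function decimalPlaces(value) {
    var parts = String(value).split(".");
    return parts.length > 1 ? parts[1].length : 0;
}

function middle(first, second) {
    var a = parseFloat(first);
    var b = parseFloat(second);
    // The average of two numbers needs at most one more decimal place than the inputs
    var places = Math.max(decimalPlaces(first), decimalPlaces(second)) + 1;
    return parseFloat(((a + b) / 2).toFixed(places));
}

middle(0.1, 1.1);      // 0.6
middle(0.001, 0.0001); // 0.00055

It seems to give the results above, but I don't know whether relying on toFixed and string parsing like this is sound, which is why I'm asking.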
To clarify: I know all about precision errors and why this happens. What I need is a simple workaround, and I have not been able to find a previous solution on SO.