In JavaScript, when I calculate `0.15 * 6`, I get `0.8999999999999999` instead of `0.9`. I understand that this is due to floating point arithmetic, but how can I avoid this problem and get the correct result?
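For reference, the behavior reproduces directly in the console:

```
console.log(0.15 * 6); // 0.8999999999999999, not 0.9
```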
To show why this question matters, here is my use case.
I am given `min`, `max`, and `step` values, each of which can be an integer or a floating point number. The input range `min - max` is not necessarily a multiple of the `step` value. If the range is not a multiple of `step`, my task is to extend the range by adjusting (increasing) the `max` value.
Example: `min = 0`, `max = 0.85`, `step = 0.15`. Here the range is `0.85 - 0 = 0.85`, which is not a multiple of the step value `0.15`, so I have to extend the `max` value.
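To illustrate, a naive multiplicity check on these example values (which itself suffers from the same precision issue, so I cannot simply rely on it):

```
const min = 0, max = 0.85, step = 0.15;

console.log((max - min) / step); // 5.666666666666667 -> not a whole number
console.log((max - min) % step); // 0.1 -> nonzero remainder, so not a multiple

// And the remainder check is not trustworthy anyway:
console.log(0.9 % 0.15); // 5.551115123125783e-17, even though 0.9 IS a multiple of 0.15
```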
I am using the formula below to get the adjusted `max` value:
```
max = Math.ceil((max - min) / step) * step + min;
```
I expect this to give:

```
>> max = Math.ceil((0.85 - 0) / 0.15) * 0.15 + 0
>> max = Math.ceil(5.666666667) * 0.15
>> max = 6 * 0.15
>> max = 0.9 // <--- But, I get 0.8999999999999999 here
```
Please note that the number of decimal places is not known in advance, so rounding to a fixed precision is (probably) not an option.
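For completeness, one workaround I have been experimenting with is to derive the decimal count from each value's string form, scale everything to integers so the arithmetic is exact, and scale back at the end. This is only a sketch; it assumes the inputs stringify in plain decimal notation (exponent forms like `1e-7` would break it), so I am not sure it is robust:

```
// Count the decimal digits of a number from its string form.
// Assumes plain decimal notation; exponent forms like 1e-7 break this.
function decimals(x) {
  const s = String(x);
  const dot = s.indexOf('.');
  return dot === -1 ? 0 : s.length - dot - 1;
}

const min = 0, max = 0.85, step = 0.15;

// Scale all inputs by a common power of ten so they become integers.
const scale = Math.pow(10, Math.max(decimals(min), decimals(max), decimals(step)));
const toInt = x => Math.round(x * scale);

// Same formula as above, but applied to exact integers, then scaled back.
const adjustedMax = (Math.ceil((toInt(max) - toInt(min)) / toInt(step)) * toInt(step) + toInt(min)) / scale;

console.log(adjustedMax); // 0.9
```

Is this a reasonable direction, or is there a cleaner way to handle this in JavaScript?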