Why does Math.round() return different results in JS for decimal numbers that have a different number of digits (precision)? For example:
console.log(Math.round(-0.5000000000000001) === -1); // true
console.log(Math.round(-0.50000000000000001) === -1); // false, is equal to 0
console.log(Math.round(0.4999999999999999) === 0); // true
console.log(Math.round(0.49999999999999999) === 0); // false, is equal to 1
It behaves as I expect for both positive and negative numbers when the precision (number of digits after the decimal point) is 16 or fewer, but with 17 or more digits it gives the other result.
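
I also tried printing the values the literals actually parse to (assuming toPrecision() just shows more digits of the stored double; the exact trailing digits below are what I got and may vary), and the 17-digit literals seem to end up as exactly ±0.5:

// Inspect what each literal is actually stored as
console.log((0.4999999999999999).toPrecision(20));   // "0.49999999999999988898" - still below 0.5
console.log((0.49999999999999999).toPrecision(20));  // "0.50000000000000000000" - exactly 0.5
console.log((-0.5000000000000001).toPrecision(20));  // "-0.50000000000000011102" - still below -0.5
console.log((-0.50000000000000001).toPrecision(20)); // "-0.50000000000000000000" - exactly -0.5

// The 17-digit literals compare equal to the plain halves
console.log(0.49999999999999999 === 0.5);   // true
console.log(-0.50000000000000001 === -0.5); // true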