EDIT: This is not a discussion about how great or broken the floating-point implementation in JS is. This is a very specific case, so please do not mark this as a duplicate of a discussion about floating-point.
This is the script I use to calculate the decimal portion of an amount expressed in cents (e.g. 3.34 is 334 cents):
const amount = 334;
const decimal = Math.trunc(100 * ((amount / 100) % 1).toFixed(2));
console.log(decimal); //34
So far so good. If you change the amount to 329, you get 28, which is wrong:
const amount = 329;
const decimal = Math.trunc(100 * ((amount / 100) % 1).toFixed(2));
console.log(decimal); //28
This is due to the fact that (329 / 100) % 1 evaluates to 0.29000000000000004 instead of 0.29, presumably because of how floating-point works in JS.
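A quick check in the console (Node or any browser, I assume it makes no difference) confirms that intermediate value:

console.log((329 / 100) % 1); // 0.29000000000000004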
But what is really puzzling to me is that when I wrote a for loop to find the cases where that script breaks, it does not break on 329:
for (let x = 325; x < 335; x++) {
  const r = Math.trunc((100 * ((x / 100) % 1)).toFixed(2));
  console.log(x, r);
}
What am I missing here? Why does it work inside the loop but not when running the snippet on its own? And how can I calculate this in a robust and reliable way?