
I am getting familiar with JS and practicing on an exercise: given the cost of a meal, e.g. $100, I need to calculate the total amount to pay after adding a 15% tip and 9.5% tax.

I wrote the two functions below, billTotal() and billTotal2(). However, billTotal() gives a weird output of $124.49999999999999, while billTotal2() gives the correct output, $124.5. I tried to use pythontutor.com to visualize what's going on, but I still can't figure it out.

My question: does anyone know why billTotal() gives the wrong output? As far as I can tell, billTotal() and billTotal2() do exactly the same thing.

function billTotal(subtotal) {
  // Applies the tip and the tax as a single multiplier.
  var total = subtotal * (1 + 0.15 + 0.095);
  return total;
}

function billTotal2(subtotal) {
  // Adds the tip and the tax as separate amounts.
  var total = subtotal + (0.15 * subtotal) + (0.095 * subtotal);
  return total;
}

console.log(billTotal(100));  // 124.49999999999999
console.log(billTotal2(100)); // 124.5
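
Logging the intermediate multiplier shows where the precision is lost. The snippet below is a minimal sketch; the toFixed rounding at the end is one common workaround for money values, not part of the original exercise:

console.log(1 + 0.15 + 0.095);         // 1.2449999999999999, not 1.245
console.log((1 + 0.15 + 0.095) * 100); // 124.49999999999999

// One common workaround: round to cents at the very end.
console.log(Number((100 * (1 + 0.15 + 0.095)).toFixed(2))); // 124.5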
LED Fantom
  • It's floating-point weirdness. See [this](https://stackoverflow.com/questions/1036662/weird-javascript-behaviour-floating-point-addition-giving-the-wrong-answer) – Martin Homola Oct 25 '17 at 23:21
  • Actually, they're not doing the same thing. In the second one the intermediate products land on exactly representable values before the addition: 0.15 * 100 = 15 and 0.095 * 100 = 9.5. Whereas in the first one you compute 1 + 0.15 + 0.095, which comes out as the float 1.2449999999999999. – Bekim Bacaj Oct 25 '17 at 23:39
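
The claim in the comment above can be checked directly. A minimal sketch (not from the original thread):

console.log(0.15 * 100 === 15);          // true:  the product rounds to exactly 15
console.log(0.095 * 100 === 9.5);        // true:  9.5 is 1001.1 in binary, exactly representable
console.log(1 + 0.15 + 0.095 === 1.245); // false: 1.245 has no exact binary64 form;
                                         //        the sum lands one ulp below it
console.log(1 + 0.15 + 0.095);           // 1.2449999999999999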

0 Answers