I am learning JavaScript, and I'm currently building a simple tip calculator using a function and a switch statement. The rule: tip 20% of the bill when the bill is less than $50, 15% when the bill is between $50 and $200, and 10% when the bill is greater than $200. Sample calls:
simpleTipCalculator(124)
simpleTipCalculator(48)
simpleTipCalculator(268)
If you calculate manually, these are the expected results:
18.599999999999998
9.600000000000001
26.8
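For reference, these numbers are just what the raw multiplications print in the JavaScript console (the long trailing digits come from floating-point math, which is not what I'm asking about here):

// Manual calculation in the console
console.log(124 * 0.15) // 18.599999999999998
console.log(48 * 0.2)   // 9.600000000000001
console.log(268 * 0.1)  // 26.8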
Here is my code so far:
function simpleTipCalculator(bill) {
  let tip = 0
  switch (bill) {
    case bill > 0 && bill < 50:
      tip = bill * 0.2
      break;
    case bill >= 50 && bill <= 200:
      tip = bill * 0.15
      break;
    default:
      tip = bill * 0.1
  }
  console.log(tip)
}
And these are the results from this function:
12.4
4.800000000000001
26.8
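While poking at this, I logged a case expression on its own for the call simpleTipCalculator(48). I read somewhere that switch compares values strictly, like ===, but I'm not sure that's the whole story:

// What the case label evaluates to, and what the comparison would be
console.log(48 > 0 && 48 < 50)          // true
console.log(48 === (48 > 0 && 48 < 50)) // false -- 48 is not the value true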
This confused me, so I changed my code to this:
function simpleTipCalculator(bill) {
  let tip = 0
  switch (true) {
    case bill > 0 && bill < 50:
      tip = bill * 0.2
      break;
    case bill >= 50 && bill <= 200:
      tip = bill * 0.15
      break;
    default:
      tip = bill * 0.1
  }
  console.log(tip)
}
Now the results are what I expected:
18.599999999999998
9.600000000000001
26.8
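For comparison, I also rewrote the same logic with a plain if/else chain (the name simpleTipCalculatorIf is just mine for this test), and it prints the same numbers as the switch(true) version:

// Same branching as the switch(true) version, written with if/else
function simpleTipCalculatorIf(bill) {
  let tip = 0
  if (bill > 0 && bill < 50) {
    tip = bill * 0.2
  } else if (bill >= 50 && bill <= 200) {
    tip = bill * 0.15
  } else {
    tip = bill * 0.1
  }
  console.log(tip)
}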
My question is: how did this happen? Please explain it to me, because I don't know what to search for on Google about this.