I'm a newbie to JavaScript and I'm not sure why my code works. I'm learning through Codecademy, and here is my code:
var orderCount = 0;

function takeOrder(topping, crustType) {
  orderCount = orderCount + 1;
  console.log('Order: ' + crustType + ' pizza topped with ' + topping);
  console.log(getSubTotal(orderCount));
}

function getSubTotal(itemCount) {
  return itemCount * 7.5;
}

takeOrder('peperoni', 'thin');
takeOrder('extra Cheese', 'medium');
takeOrder('Bacon', 'EXTRA THICK');
I get the output I want, which is:
Order: thin pizza topped with peperoni
7.5
Order: medium pizza topped with extra Cheese
15
Order: EXTRA THICK pizza topped with Bacon
22.5
But why? How does JavaScript know how many orders there are in the code?
My guess is that it's because of orderCount = orderCount + 1;
and:
takeOrder('peperoni', 'thin');
takeOrder('extra Cheese', 'medium');
takeOrder('Bacon', 'EXTRA THICK');
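To check that guess, I tried a stripped-down version with just the counter part (the names count and bump are made up by me, they're not from the lesson):

var count = 0;

function bump() {
  count = count + 1; // same pattern as orderCount = orderCount + 1
  console.log(count);
}

bump(); // logs 1
bump(); // logs 2
bump(); // logs 3

It prints 1, 2, 3, so it seems like a variable declared outside the function keeps its value between calls, and each call just adds 1 to it.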
But I'm really not sure, and I'd much rather understand why my code works :)