The following JS code behaves oddly:
var c = 0;
var a = 0;

myTest(8, 5);

function myTest(a, b) {
  console.log(a); // 8
  a++;
  c++;
  console.log(a); // 9
  return a;
}

console.log(a); // 0
console.log(c); // 1
https://jsfiddle.net/hwns1v4L/
If I take the "a" parameter out of the function, "a" increments by 1 and the third console.log prints 1. Totally normal. But if I keep "a" as a parameter of myTest (as in the code above), it gets the value 8, increments to 9, and the third console.log prints 0.
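For comparison, this is roughly what I mean by the version without the parameter (same code, only the parameter list changes; the output comments are what I observed):

var c = 0;
var a = 0;

myTest(5);

function myTest(b) {
  console.log(a); // 0
  a++;            // no parameter named "a" here, so this changes the outer "a"
  c++;
  console.log(a); // 1
  return a;
}

console.log(a); // 1
console.log(c); // 1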
What is the explanation for this weird behavior? I am sorry if this is explained in another thread; I am too new to JavaScript to come up with good search queries or to follow advanced answers.