I am wondering what's going on behind the scenes in for...in vs. for...of loops in JavaScript. I accidentally switched them up on two similar instances, where one worked and one didn't, and I'm trying to figure out why the one that worked worked while the one that didn't didn't.
The problems were basic coding exercises: list first all the odd and then all the even numbers in an array containing the integers 1-10 inclusive. Details of what I wrote are below.
For the first one, to list all odd numbers, I wrote:
let numbers = [1,2,3,4,5,6,7,8,9,10];

function displayOddNumbers() {
  for (num in numbers) {
    num % 2 != 0 ? console.log(num) : console.log();
  }
}
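In hindsight, I suspect this only looked correct by coincidence: for...in visits the array's keys (the indices, as strings), and for this particular array the odd indices 1, 3, 5, 7, 9 happen to be the same digits as the odd values. Here's a sanity-check snippet I wrote afterwards (not part of the exercise):

```javascript
const numbers = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10];

// for...in iterates the enumerable keys of the array: the
// strings "0" through "9", not the elements themselves.
for (const num in numbers) {
  if (num % 2 != 0) {   // "1" % 2 coerces the string key to a number
    console.log(num);   // logs the *index*, not the element
  }
}
// prints 1, 3, 5, 7, 9 — the same digits as the odd elements, by luck
```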
This worked, so I moved on with my life, not realizing I'd unwittingly used a for...in loop instead of a for...of loop, and went on to the next exercise, which asked me to list all the even numbers from the same array. I wrote:
function displayEvenNumbers() {
  for (num in numbers) {
    num % 2 == 0 ? console.log(num) : console.log();
  }
}
This did not work; it listed 0, 2, 4, 6, 8 rather than the even numbers 2, 4, 6, 8, 10. When I changed it to
function displayEvenNumbers() {
  for (num in numbers) {
    num % 2 != 0 ? console.log(numbers[num]) : console.log();
  }
}
then it worked.
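Tracing the fixed version, it seems to make mechanical sense: the condition picks out the odd indices 1, 3, 5, 7, 9, and the elements sitting at those indices are exactly the even numbers. Here's my own trace, using an if instead of the ternary:

```javascript
const numbers = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10];
const evens = [];

for (const num in numbers) {
  if (num % 2 != 0) {           // odd *index* (as a string key)...
    evens.push(numbers[num]);   // ...holds an even *element*
  }
}
console.log(evens); // [ 2, 4, 6, 8, 10 ]
```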
But what's going on? I realize I should have used a for...of loop to iterate over the array, but what is happening behind the scenes that allows this code to work sometimes but not other times, seemingly counterintuitively?
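For comparison, my understanding is that a for...of loop hands you the elements directly rather than the keys, so a version like this (my after-the-fact rewrite) should behave the way I originally expected:

```javascript
const numbers = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10];

function displayEvenNumbers() {
  // for...of yields the values 1 through 10, not the string indices
  for (const num of numbers) {
    if (num % 2 === 0) {
      console.log(num);
    }
  }
}

displayEvenNumbers(); // prints 2, 4, 6, 8, 10
```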