I have an array with 5 items. I want the user to choose one item.
var devops_tutorials = ["Git", "Jenkins", "Docker", "Ansible", "Vagrant"];
I want the user to choose one of the options by entering a value from 1 through 5:
var user_option = prompt("Which Course? 1.Git | 2.Jenkins | 3.Docker | 4.Ansible | 5.Vagrant");
Now, typeof user_option reports "string". So, when I print the value chosen by the user on the console, should I convert user_option to a number first? Without typecasting, I don't understand why the statement below works fine:
console.log("You have chosen " + devops_tutorials[user_option - 1]);
And this also works fine:
console.log("You have chosen " + devops_tutorials[Number(user_option) - 1]);
I expected JavaScript to throw an error when I pass the variable without typecasting. What's happening internally?
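
For reference, here is a minimal snippet showing what I observe (assuming the user types 3 at the prompt; the hard-coded "3" below just stands in for that input):

var devops_tutorials = ["Git", "Jenkins", "Docker", "Ansible", "Vagrant"];
var user_option = "3";                          // stand-in for prompt(), which always returns a string
console.log(typeof user_option);                // "string"
console.log(user_option - 1);                   // 2 -- subtraction still works on the string
console.log(Number(user_option) - 1);           // 2 -- same result with explicit conversion
console.log(devops_tutorials[user_option - 1]); // "Docker" in both cases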