In my recently begun quest to learn C, one of the first things I have made is a very basic "sum of two numbers" program.
/* Basic addition program */
#include <stdio.h>

int main(void) {
    int v_1;    /* first number */
    int v_2;    /* second number */
    int answer; /* sum of the two numbers */

    printf("Simple addition calculator\n");
    printf("enter first number: ");
    scanf("%d", &v_1);
    printf("enter second number: ");
    scanf("%d", &v_2);

    answer = v_1 + v_2;
    printf("= %d\n", answer);

    return 0;
}
When I run this and enter two numbers, everything works fine and I get the correct answer. The output looks like this:
enter first number: 1
enter second number: 1
= 2
I decided I'd try to break it by using letters instead of numbers, expecting that it would either convert them into numeric values or error out and crash.
However, that isn't what happens. Instead I get the following:
enter first number: a
enter second number: = 32765
My question isn't so much "how do I fix this?" as it is "what is actually happening here?"
EDIT: Again, this question is more about trying to understand what is happening than about fixing a problem. I'm not looking for "how can this be avoided?"; I'm looking for "what causes it, and why?"
TL;DR: I can really simplify what I want to know down to two questions.
Why do I get a different junk value for each letter of the alphabet that I enter, e.g. 32764 every time I enter a and 32767 every time I enter b?
Why does the second variable's input get skipped?
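In case it helps, here is a minimal diagnostic sketch (not my original program; the sentinel initialisers and the getchar() peek are additions purely for illustration) that prints what each scanf() call reports back and what is left sitting in the input buffer:

/* Diagnostic variant: scanf() returns the number of items it
   successfully converted, so a 0 here means the variable was
   never assigned by that call. */
#include <stdio.h>

int main(void) {
    int v_1 = -1;   /* sentinel values so we can tell whether scanf wrote anything */
    int v_2 = -1;
    int got_1, got_2;

    printf("enter first number: ");
    got_1 = scanf("%d", &v_1);

    printf("enter second number: ");
    got_2 = scanf("%d", &v_2);

    printf("scanf results: %d and %d\n", got_1, got_2);
    printf("values: v_1 = %d, v_2 = %d\n", v_1, v_2);

    /* Peek at the next unread character still sitting in stdin. */
    int leftover = getchar();
    if (leftover != EOF)
        printf("next character left in the input buffer: '%c'\n", leftover);

    return 0;
}

With a letter as the first input, this shows how many conversions each scanf call actually made, whether the variables were ever written to, and which character was left behind in stdin for the second call to read.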