Incorrect input using %c
Consider the following snippet of code:
#include <stdio.h>

int main(void) {
    int x;
    char y;
    scanf("%d", &x);
    scanf("%c", &y);
    printf("%d %c", x, y);
    return 0;
}
Behavior:
When you run the above program, the first scanf is called and
waits for you to enter a decimal number. Say you enter 12 (that's
ASCII '1' and ASCII '2') and then hit the "enter" key to signal the
end of the input. Next the program executes the second scanf, except
you will notice that it doesn't wait for you to input a character and
instead goes right ahead and outputs 12 followed by a '\n'.
Explanation:
Why does that happen? Let's look at the behavior of the
program step by step.
Initially there is nothing in the buffer. When the first scanf() is called, it has nothing
to read, and so it waits. It keeps waiting until you type 1, 2, then "enter". Now the
buffer holds the character '1', the character '2', and the character '\n'. Remember that '\n'
signals the end of input once all fields have been entered, but it is also stored in the
buffer as an ordinary ASCII character. At this point scanf reads the longest sequence of
characters from the buffer that forms a valid decimal number. In this example, it finds the
string "12", converts it to the integer value twelve, and stores it in x. Then scanf returns
control to the main function with a return value of 1, for having converted one entry
successfully. In our example, we do not capture scanf's return value in a variable.
For robust code, it is important to check the return value of scanf() to make sure that the
user entered valid data.
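For example, a defensive version of the first read might look like the
minimal sketch below (the error message wording is our own):

#include <stdio.h>

int main(void) {
    int x;
    /* scanf returns the number of fields it converted successfully;
       anything other than 1 here means the read failed */
    if (scanf("%d", &x) != 1) {
        fprintf(stderr, "expected a decimal number\n");
        return 1;
    }
    printf("read %d\n", x);
    return 0;
}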
What is now left in the buffer is '\n'. Next, the second scanf is
called, expecting a character. Since the buffer already has
the '\n' character in it, scanf sees it, takes it from the buffer,
and puts it in y. That's why, when the printf afterwards executes,
12 followed by a newline is printed to the screen.
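You can see this for yourself by printing y's character code, as in the
sketch below: with the input 12 followed by enter, it prints 10, the
ASCII code for '\n' (the printf format here is our own):

#include <stdio.h>

int main(void) {
    int x;
    char y;
    scanf("%d", &x);
    scanf("%c", &y);
    /* y is promoted to int in the call, so %d prints its character code */
    printf("x = %d, code of y = %d\n", x, y);
    return 0;
}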
Solution:
The moral of the story is that enter is a character like any
other: it goes into the buffer, and %c consumes it from the buffer just
like any other ASCII character. To fix this, try using this code
instead:
#include <stdio.h>

int main(void) {
    int x;
    char y;
    scanf("%d", &x);
    scanf("\n%c", &y);   /* note the \n !! */
    printf("%d %c", x, y);
    return 0;
}
How does this fix the problem?
So you again type '1', '2', '\n'. The first scanf reads "1" and "2", converts that to the decimal value twelve, and leaves the '\n' in the buffer.
The next scanf now expects a '\n' at the beginning of the next input.
It finds the '\n' in the buffer, reads it, and discards it. Now the
buffer is empty and scanf waits for the user to input a character.
Say the user inputs 'c', followed by the enter key. 'c' is now
assigned to y, NOT "enter". Therefore printf will output "12 c" to the
screen. NOTE: there is again a '\n' sitting in the buffer now. So if
you need to do another scanf for a single character, you will have to
"consume" that '\n' before taking another character in from the user,
as sketched below.
This is not an issue for most other format specifiers, such as %d, %f, and %s, because
they skip leading whitespace automatically; %c (and %[) are the exceptions that read
whitespace literally.
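As a quick illustration (a sketch with our own variable names): with the
input 12, enter, 34, enter, both conversions below succeed, because the
second %d silently skips the leftover newline:

#include <stdio.h>

int main(void) {
    int a, b;
    scanf("%d", &a);
    scanf("%d", &b);   /* %d skips the '\n' left by the first read */
    printf("%d %d\n", a, b);
    return 0;
}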