I know this has been discussed before, but I want to make sure I understand correctly what is happening in this program and why. On page 20 of Kernighan and Ritchie's textbook, The C Programming Language, we see this program:
#include <stdio.h>

int main()
{
    int c;

    c = getchar();
    while (c != EOF) {
        putchar(c);
        c = getchar();
    }
    return 0;
}
When executed, the program reads the characters keyed in and prints them back out in the same order after the user hits Enter. This repeats indefinitely unless the user manually exits the console. The sequence of events, as I understand it, is as follows:
1. The getchar() function reads the first character keyed in and assigns its value to c. Because c is an integer type, the character value that getchar() passed to c is promoted to its corresponding ASCII integer value.

2. Now that c has been initialized to some integer value, the while loop can test whether that value equals the End-Of-File character. Because the EOF character has a macro value of -1, and because none of the characters that can be keyed in have a negative ASCII value, the condition of the while loop will always be true. (I wrote a small test, shown after this list, to check these integer values for myself.)

3. Once the program verifies that c != EOF is true, the putchar() function is called, which outputs the character value contained in c.

4. getchar() is then called again, so it reads the next input character and passes its value back to the start of the while loop. If the user only keys in one character before hitting Enter, the program reads the <return> value as the next character, prints a newline, and waits for the next input to be keyed in. (There is also a more compact way of writing this loop, shown after this list.)
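To check the integer-value part for myself, I put together this little test (my own sketch, not something from the book). It just prints the numeric value that getchar() hands back for each keystroke, so on an ASCII system typing A should print 65, and when the loop finally ends, c should hold a negative value:

#include <stdio.h>

int main(void)
{
    int c;

    /* Print the raw integer value getchar() returns for each keystroke,
       plus the character itself when it is not a newline. */
    while ((c = getchar()) != EOF) {
        if (c == '\n')
            printf("got %d (newline)\n", c);
        else
            printf("got %d ('%c')\n", c, c);
    }

    /* After the loop, c holds whatever EOF expanded to (commonly -1). */
    printf("loop ended, c = %d\n", c);
    return 0;
}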
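As an aside, I believe the book later condenses the two getchar() calls into one by moving the assignment into the while condition; if I have that right, an equivalent version of the copy program looks like this:

#include <stdio.h>

int main(void)
{
    int c;

    /* Same copy-input-to-output behavior as above, but the assignment
       to c happens inside the loop condition, so getchar() appears
       only once. */
    while ((c = getchar()) != EOF)
        putchar(c);

    return 0;
}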
Is any of this remotely correct?