I am trying to print a table of integer squares. The aim is to write a program that pauses after every 24 squares and asks the user to press Enter to continue. This is not too complex, and I have "completed" the task in C.
My concern: a seemingly insignificant (but observable) issue occurs when I compile and run the following code:
#include <stdio.h>

int main()
{
    int i, n;

    printf("This Program prints a table of squares. \n");
    printf("Enter number of entries in table: ");
    scanf("%d", &n);

    for (i = 1; i <= n; i++) {
        if ((i - 1) % 24 == 0 && i > 1) {
            printf("Press Enter to continue...");
            while (getchar() != '\n')
                ;
        }
        printf("%10d%10d\n", i, i * i);
    }
}
My output is perfect, exactly what I want, EXCEPT at the first multiple of 24 (i.e. 24). If I enter a large n, the table prints out perfectly, but I am only required to press Enter from 48 onwards: at 24 the prompt's printf output is weird and it doesn't wait for the user to press any key (let alone Enter). It looks like this:
        24       576
Press Enter to continue...        25       625
        26       676
but then the next time the prompt is printed, everything is seemingly perfect and I am required to press Enter to continue, as it ought to be. (Here is a sample of the healthy output later in the run; it looks like this for all subsequent multiples of 24.)
        48      2304
Press Enter to continue...
        49      2401
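To isolate just the part that confuses me, I stripped the program down to this (my own reduction, not from the book); it shows the same thing, in that the prompt does not wait for me at all:

#include <stdio.h>

int main()
{
    int n;

    printf("Enter a number: ");
    scanf("%d", &n);    /* I type a number and press Enter here */

    printf("Press Enter to continue...");
    while (getchar() != '\n')    /* the same loop as in the full program */
        ;
    printf("done\n");
}

After I type the number and press Enter, "done" appears immediately, with no pause at the prompt, exactly like the i = 24 case above.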
The one thing that rectifies this is putting an extra getchar() call between the printf and the while loop. The code is exactly the same, except that the for loop now looks like this:
for (i = 1; i <= n; i++) {
    if ((i - 1) % 24 == 0 && i > 1) {
        printf("Press Enter to continue...");
        getchar();
        while (getchar() != '\n')
            ;
    }
    printf("%10d%10d\n", i, i * i);
}
In which case the code runs perfectly, except that from 48 onwards I am required to press Enter twice. Presumably this is because of the extra getchar() I have put in place, but then why do I only need to press Enter once when i = 24?
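For completeness, the only other arrangement I have thought of (my own idea, not something the book suggests, and I have not reasoned it through) is to consume whatever scanf might leave behind immediately after reading n, before the loop ever runs, so the relevant portion would become:

scanf("%d", &n);
while (getchar() != '\n')    /* my guess: swallow whatever is left on this input line */
    ;

for (i = 1; i <= n; i++) {
    if ((i - 1) % 24 == 0 && i > 1) {
        printf("Press Enter to continue...");
        while (getchar() != '\n')
            ;
    }
    printf("%10d%10d\n", i, i * i);
}

But even if that behaves, I would still like to understand why the original version acts the way it does.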
My guess is that this has something to do with the (i - 1) % 24 condition, but I can't see what the problem is. This is a question from Chapter 7 of K. N. King's C Programming: A Modern Approach. I am a self-taught programmer; I hope I have made my question easy to understand and that it elucidates something important about C. Could this be considered implementation-defined behaviour? Perhaps on another machine what I am describing would not have happened? And if not, why not?