
My program prompts the user to enter 20 integer values, then calculates their mean. I entered the following: 3.5, 6, 9, 12, 15, 18, 21, 24, 1000, 4500, 900, 7, 8, 2, 12, 5, 4, 3, 2, 1

All of the values are integers except the first one, yet the program tells me the mean is 3.000000. Why exactly does this happen? My best guess is that because the scanf call for each value reads an integer, entering 3.5 gets rounded to either 3 or 4, but even that wouldn't explain a resulting mean of 3.000000.

#include <stdio.h>

int main(void) {
    int count, value;
    double avg, sum;
    count = 0;
    sum = 0;
    avg = 0;

    while (count < 20) {
        printf("Enter a positive integer\n");
        scanf("%i", &value);

        if (value >= 0) {
            sum = sum + value;
            count = count + 1;
        }
        else {
            printf("value must be positive");
        }
    }
    avg = sum / count;
    printf("Average is %lf\n ", avg);
    return 0;
}

1 Answer


When you ask scanf to read an integer, it reads digits until it finds a non-digit character. In this case, that character is the decimal separator '.'.
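A minimal sketch that makes this visible, assuming the hypothetical input "3.5" is typed at the prompt: the first scanf stops at the dot, and a follow-up string read shows what was left behind.

#include <stdio.h>

int main(void) {
    int n;
    char rest[16];

    /* on the input "3.5", this reads only the 3 ... */
    if (scanf("%d", &n) == 1) {
        printf("integer read: %d\n", n);
    }

    /* ... and ".5" is still sitting in the input buffer */
    if (scanf("%15s", rest) == 1) {
        printf("left in the buffer: %s\n", rest);
    }
    return 0;
}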

The decimal separator and everything after it are left in the input buffer. That causes problems when you read integers in a loop: the next call to scanf sees the dot, decides no integer was entered, and reads nothing, which leaves the dot in the buffer for the next iteration, and so on forever.
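One way to defuse that endless loop, sketched here rather than prescribed: check scanf's return value and throw away the rest of the line whenever parsing fails. The helper name read_nonnegative_int is made up for illustration, and this sketch does not handle end-of-file.

#include <stdio.h>

/* Keep asking until scanf successfully parses a value >= 0. */
int read_nonnegative_int(void) {
    int value;
    while (scanf("%d", &value) != 1 || value < 0) {
        int c;
        /* discard the offending characters up to the newline */
        while ((c = getchar()) != '\n' && c != EOF) { }
        printf("Please enter a positive integer\n");
    }
    return value;
}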

If any one of these numbers could be a floating-point value, you should read them all as floating-point values.
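A sketch of the question's loop reworked that way, assuming every entry fits comfortably in a double; the 20-value count and the sign check are kept from the original program.

#include <stdio.h>

int main(void) {
    int count = 0;
    double value, sum = 0.0;

    while (count < 20) {
        printf("Enter a positive number\n");
        if (scanf("%lf", &value) != 1) {
            /* not a number at all: drain the bad input and try again */
            int c;
            while ((c = getchar()) != '\n' && c != EOF) { }
            printf("Invalid input\n");
        } else if (value >= 0) {
            sum = sum + value;
            count = count + 1;
        } else {
            printf("value must be positive\n");
        }
    }
    printf("Average is %f\n", sum / count);
    return 0;
}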
