My program prompts the user to enter 20 integer values and then calculates their mean. I entered the following: 3.5, 6, 9, 12, 15, 18, 21, 24, 1000, 4500, 900, 7, 8, 2, 12, 5, 4, 3, 2, 1
All of the values are integers except the first one, yet the program reports a mean of 3.000000. Why does this happen? The best explanation I can come up with is that the scanf call reads each value as an integer, so entering 3.5 gets truncated to 3 (or rounded to 4), but that still wouldn't explain a mean of exactly 3.000000.
#include <stdio.h>

int main(void) {
    int count, value;
    double avg, sum;

    count = 0;
    sum = 0;
    avg = 0;

    /* keep reading until 20 non-negative values have been accepted */
    while (count < 20) {
        printf("Enter a positive integer\n");
        scanf("%i", &value);          /* read the next value as an integer */
        if (value >= 0) {
            sum = sum + value;
            count = count + 1;
        }
        else {
            printf("value must be positive");
        }
    }

    avg = sum / count;                /* sum is a double, so this is floating-point division */
    printf("Average is %lf\n", avg);
    return 0;
}
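
To test my guess, I put together a minimal sketch (this is only my assumption about what scanf does with the input, not part of the original program) that reads a few integers the same way and checks scanf's return value, which I understand is 1 when an integer is read and 0 when the conversion fails:

#include <stdio.h>

/* Minimal test sketch (assumption): scanf("%i") consumes the "3" of "3.5",
 * leaves ".5" in the input buffer, and every later call fails without
 * touching `value`. Checking scanf's return value should show this. */
int main(void) {
    int value = 0;
    for (int i = 0; i < 5; i++) {
        int rc = scanf("%i", &value);   /* 1 = integer read, 0 = conversion failed */
        printf("call %d: scanf returned %d, value = %d\n", i + 1, rc, value);
    }
    return 0;
}

If I'm reading the scanf documentation correctly, typing 3.5 should make the first call return 1 with value = 3, and the later calls return 0 without changing value, but I'm not certain that is actually what happens in the full program above.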