Please find the question context below:
Write a program that takes in a floating point value representing an amount, for example 2.8 indicating 2 dollars and 80 cents. The program should then display the minimum number of coins required to repay the amount in coins. Assume that the user enters a value above 0 and below 10.
#include <stdio.h>

int main(void) {
    double amt;
    printf("Enter amount:");
    scanf("%lf", &amt);

    int amt_cents;
    amt_cents = amt * 100;

    int dollar_1;
    dollar_1 = amt_cents / 100;
    amt_cents = amt_cents - (dollar_1 * 100);

    int cents_50;
    cents_50 = amt_cents / 50;
    amt_cents = amt_cents - (cents_50 * 50);

    int cents_20;
    cents_20 = amt_cents / 20;
    amt_cents = amt_cents - (cents_20 * 20);

    int cents_10;
    cents_10 = amt_cents / 10;
    amt_cents = amt_cents - (cents_10 * 10);

    int cents_05;
    cents_05 = amt_cents / 5;
    amt_cents = amt_cents - (cents_05 * 5);

    int cents_01;
    cents_01 = amt_cents / 1;
    amt_cents = amt_cents - (cents_01 * 1);

    if (dollar_1 != 0) {
        printf("Number of 1$: %d\n", dollar_1);
    }
    if (cents_50 != 0) {
        printf("Number of 50c: %d\n", cents_50);
    }
    if (cents_20 != 0) {
        printf("Number of 20c: %d\n", cents_20);
    }
    if (cents_10 != 0) {
        printf("Number of 10c: %d\n", cents_10);
    }
    if (cents_05 != 0) {
        printf("Number of 5c: %d\n", cents_05);
    }
    if (cents_01 != 0) {
        printf("Number of 1c: %d\n", cents_01);
    }
}
Output:
Enter amount:1.1 Number of 1$: 1 Number of 10c: 1
Enter amount:2.1 Number of 1$: 2 Number of 10c: 1
Enter amount:3.1 Number of 1$: 3 Number of 10c: 1
Enter amount:4.1 Number of 1$: 4 Number of 5c: 1 Number of 1c: 4
Enter amount:5.1 Number of 1$: 5 Number of 5c: 1 Number of 1c: 4
Enter amount:6.1 Number of 1$: 6 Number of 10c: 1
Question: Why do the values 4.1 and 5.1 behave differently from all the other values within 0 - 10? Tracing through the code manually, it appears that 4.1 and 5.1 should be consistent with the other cases and produce only a single 10c coin, but that is not what happens when the program is executed.
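
For reference, here is a minimal check I could run (a sketch, not part of the original program; the amounts array and the loop are added just for this test, while the variable names and the conversion mirror the code above). It prints the raw double value of amt * 100 next to the truncated int amt_cents that the program actually stores, so the manual trace can be compared against what the program computes for each input:

#include <stdio.h>

int main(void) {
    /* Test amounts taken from the runs shown in the output above. */
    double amounts[] = {1.1, 2.1, 3.1, 4.1, 5.1, 6.1};
    int n = sizeof(amounts) / sizeof(amounts[0]);

    for (int i = 0; i < n; i++) {
        double amt = amounts[i];
        /* Same conversion as in the program: the double product is
           truncated when it is assigned to an int. */
        int amt_cents = amt * 100;
        /* Show the double product with extra digits alongside the
           truncated value the program works with. */
        printf("amt = %.1f, amt * 100 = %.15f, amt_cents = %d\n",
               amt, amt * 100, amt_cents);
    }
}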