
I'm new to C and trying to write a program that reads an amount of money ($XX.XX) and then displays the fewest coins needed to make that amount. I have a program that mostly works, but it is often off by a penny for some inputs, for example 13.49, and I can't figure out why.

    #include <stdio.h>
    #include <stdlib.h>
    #include <math.h>

    int main()
    {
        int StayOn = 1;
        while (StayOn == 1)
        {
            int x = 0;
            float MoneyStart = 0, RemainingMoney = 0;
            char cName[6][8] = {"Toonie", "Loonie", "Quarter", "Dime", "Nickel", "Penny"};
            float cValue[6] = {2.00, 1.00, 0.25, 0.10, 0.05, 0.01};
            int cCount[6] = {0, 0, 0, 0, 0, 0};

            printf("Please Enter Amount: $");
            scanf("%f", &MoneyStart);
            getchar();

            RemainingMoney = MoneyStart;

            for (x = 0; x < 6; x++)
            {
                cCount[x] = (RemainingMoney / cValue[x]);
                RemainingMoney = fmod(RemainingMoney, cValue[x]);
                printf("%8s  - %3i = $%.2f\n", cName[x], cCount[x], (cValue[x]*cCount[x]));
            }
        }
    }
You might want to read ["Is floating point math broken?"](http://stackoverflow.com/questions/588004/is-floating-point-math-broken). It might get better if you change to `double`, but generally speaking, using floating-point types for money will sooner or later lead to rounding errors, which get compounded with every calculation you do. – Some programmer dude Jan 23 '16 at 20:15

1 Answer


It's off because of cumulative rounding errors in the calculations. A `float` cannot represent most decimal fractions exactly: `13.49` is actually stored as a value slightly below 13.49, and each `fmod` step carries that error forward until a penny goes missing.
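
As a quick illustration (a minimal sketch; the exact digits printed depend on your platform's floating-point formats):

    #include <stdio.h>

    int main(void)
    {
        float  f = 13.49f;
        double d = 13.49;

        /* Neither type stores 13.49 exactly; both hold the nearest
           representable binary fraction. */
        printf("float : %.10f\n", f);  /* prints 13.4899997711 on IEEE-754 systems */
        printf("double: %.10f\n", d);  /* prints 13.4900000000; the error is further out */
        return 0;
    }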

You can use `double` for more precision, but for money the robust fix is to round the input to whole cents once, up front, and then do all further arithmetic with integers, which is exact.
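
A minimal sketch of that approach (the `llround` conversion and the cents-based coin values are my suggestion, not your original code; error handling is kept to a bare minimum):

    #include <stdio.h>
    #include <math.h>

    int main(void)
    {
        const char *cName[6]  = {"Toonie", "Loonie", "Quarter", "Dime", "Nickel", "Penny"};
        const int   cValue[6] = {200, 100, 25, 10, 5, 1};  /* coin values in cents */

        double money = 0.0;
        printf("Please Enter Amount: $");
        if (scanf("%lf", &money) != 1)
            return 1;

        /* Round to whole cents once; everything after this point is
           exact integer arithmetic, so no penny can be lost. */
        long long remaining = llround(money * 100.0);

        for (int x = 0; x < 6; x++)
        {
            long long count = remaining / cValue[x];  /* exact integer division */
            remaining %= cValue[x];
            printf("%8s  - %3lld = $%.2f\n", cName[x], count, count * cValue[x] / 100.0);
        }
        return 0;
    }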

See Joachim's comment on the question above for the details.

– Ilya