I'm somewhat new to programming, only having taken one course on Python at university and now working my way through Harvard's CS50 OpenCourseware, so please bear with me.
This code compiles fine with no errors. The program takes a user-input amount of change and uses a simple greedy algorithm to report the fewest U.S. coins of each denomination needed to represent it (e.g., ".41" should come out to 1 quarter, 1 dime, 1 nickel, and 1 penny). Simple enough; what's killing me is that, for some reason, it doesn't always count the pennies.
Entering ".41" gives "1 Quarters, 1 Dimes, 1 Nickels, and 0 Pennies"; entering ".42" gives "1 Quarters, 1 Dimes, 1 Nickels, and 1 Pennies" (also one penny short); but ".43" oddly gives the correct "1 Quarters, 1 Dimes, 1 Nickels, and 3 Pennies".
It's the inconsistency of this bug that's making it so hard to track down. I keep walking through the code in my head with different inputs, trying to pinpoint the problem, but it's been futile. (I've also put a stripped-down check of the arithmetic after the code below.)
What am I doing wrong?
#include <stdio.h>
#include <cs50.h>

int main(void)
{
    printf("How much change is owed? ");
    float change = GetFloat();

    int quarters = 0;
    int dimes = 0;
    int nickels = 0;
    int pennies = 0;

    float coinArray[4] = {.25, .10, .05, .01};
    int coinNames[4] = {quarters, dimes, nickels, pennies};

    int counter(float coinArray);
    {
        int x;
        for (x = 0; x < 4; x++)
        {
            while (change >= coinArray[x])
            {
                change = change - coinArray[x];
                coinNames[x]++;
            }
        }
    }

    printf("%d Quarters, %d Dimes, %d Nickels, and %d Pennies\n",
           coinNames[0], coinNames[1], coinNames[2], coinNames[3]);
}
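
In case it's useful, here's a stripped-down check of just the subtraction sequence for ".41", separate from my actual program. The values in the comments are only roughly what I'd expect the high-precision output to show; I'm not certain of the exact digits or how to interpret them:

#include <stdio.h>

int main(void)
{
    // The same subtractions my loop performs for an input of .41
    float change = 0.41f;
    change = change - 0.25f; // one quarter
    change = change - 0.10f; // one dime
    change = change - 0.05f; // one nickel

    // Print the remainder and the penny value with extra digits
    printf("remainder: %.20f\n", change); // roughly 0.00999999418854713440
    printf("penny:     %.20f\n", 0.01f);  // roughly 0.00999999977648258209
    printf("remainder >= penny? %d\n", change >= 0.01f); // prints 0
}

If I'm reading that right, the leftover change ends up a hair below what gets stored for .01, so the penny loop never runs for ".41" — but I don't understand why the math comes out that way or what the right fix is.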