
Given a list of coin denominations, I need to find the minimum number of coins required to make a given value.

My approach uses a greedy algorithm:

Divide the value by the largest denomination, then take the remainder and divide it by the second-largest denomination, and so on until we reach the required value.

But this approach fails in some cases.

I want to know:

  1. An approach that works for all cases
  2. Why does the greedy approach fail?

Example where the approach fails:

Coin denominations: (1, 3, 4, 5)

Value required: 7

Using the greedy approach:

7/5 = 1, so we take one coin of 5, leaving a remainder of 2. Neither 3 nor 4 can be used, so we need two coins of value 1. That is a total of 3 coins.

However, the optimal solution is 4 + 3 (2 coins).
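For concreteness, here is a minimal Python sketch of the greedy approach I described (the function name and structure are my own illustration):

    def greedy_coin_count(coins, value):
        # Repeatedly take as many of the largest remaining denomination as fits.
        count = 0
        for coin in sorted(coins, reverse=True):
            count += value // coin
            value %= coin
        return count if value == 0 else None  # None: exact change impossible

    print(greedy_coin_count([1, 3, 4, 5], 7))  # prints 3, but the optimum is 2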

coder hacker
  • Can you give an example, please? – Ved May 19 '14 at 09:00
  • The greedy approach will fail when the denominations are {4,3,1} and the amount is 6: it yields (4,1,1) when it should yield (3,3). – Bas May 19 '14 at 09:02
  • Have a look at http://stackoverflow.com/questions/13557979/why-does-the-greedy-coin-change-algorithm-not-work-for-some-coin-sets?rq=1. – Zeta May 19 '14 at 09:02

2 Answers


There is a classic solution to the problem you described: it is a variant of the knapsack problem, and it can be solved with dynamic programming. For a clear explanation you may follow this tutorial.
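As a rough illustration of the dynamic programming approach (my own sketch, not code from the tutorial), dp[v] holds the minimum number of coins that sum to v:

    def min_coins(coins, value):
        # dp[v] = minimum number of coins needed to make value v.
        INF = float('inf')
        dp = [0] + [INF] * value
        for v in range(1, value + 1):
            for coin in coins:
                if coin <= v and dp[v - coin] + 1 < dp[v]:
                    dp[v] = dp[v - coin] + 1
        return dp[value] if dp[value] != INF else None  # None: value unreachable

    print(min_coins([1, 3, 4, 5], 7))  # prints 2 (4 + 3)

This runs in O(value * len(coins)) time and, unlike greedy, is correct for any denomination set.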

juver

Let's say you want to make the value €2.10 and have coins of value 2, 1, 0.50, and three coins of 0.20.

Greedy always takes the biggest coin first, so it will choose the coin of value 2. Now it is no longer possible to reach 2.10, as you do not have a coin of value 0.10, even though it would have been possible had you chosen the 1, the 0.50, and the three 0.20 coins.

In this case (as in most cases) I would recommend backtracking, which does not fail here.
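A minimal backtracking sketch in Python (my own illustration; amounts are in cents to avoid floating-point issues, and it returns some exact change from the available coins, trying larger coins first and undoing choices that hit a dead end):

    def make_change(coins, value):
        # coins is the multiset of available coins, given in cents.
        coins = sorted(coins, reverse=True)

        def search(start, remaining, used):
            if remaining == 0:
                return used
            for i in range(start, len(coins)):
                if coins[i] <= remaining:
                    found = search(i + 1, remaining - coins[i], used + [coins[i]])
                    if found is not None:
                        return found
            return None  # dead end: backtrack and try the next coin

        return search(0, value, [])

    # 2.10 EUR with coins 2.00, 1.00, 0.50 and three 0.20 coins:
    print(make_change([200, 100, 50, 20, 20, 20], 210))  # [100, 50, 20, 20, 20]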

Prior99