I have a small problem understanding the coin change problem in dynamic programming. Simply put, I have to make change for a given sum using the minimum number of coins.
I have n coin denominations with values 1 = v1 < v2 < ... < vn, and M(j) denotes the minimum number of coins required to make change for amount j. The recurrence is

M(j) = min{ M(j - vi) + 1 : vi <= j },  with M(0) = 0.

In this formula I don't understand what M(j - vi) means. Does vi have to be the maximum value among the coins used to make change for j - 1?
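To make my confusion concrete, here is a minimal Python sketch of how I currently read the recurrence, computed bottom-up (the function name min_coins and the list coins are just my own illustration, not from any source I'm following):

```python
def min_coins(coins, amount):
    INF = float("inf")
    M = [0] + [INF] * amount          # M[0] = 0: zero coins make change for amount 0
    for j in range(1, amount + 1):
        for v in coins:
            if v <= j:
                # M[j - v] is the optimal answer for the smaller amount j - v;
                # adding one coin of value v extends that solution to amount j.
                M[j] = min(M[j], M[j - v] + 1)
    return M[amount]

print(min_coins([1, 3, 4], 6))  # -> 2 (3 + 3)
```

Is this the right reading, i.e. that every vi <= j is tried as the last coin, rather than vi being fixed to the largest coin used so far?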