I've been trying to find the optimal solution for the 0/1 knapsack problem with big weights and values, like {300: 256, 653: 300, 1500: 500}. From my research, I found out that the standard solution relies on a DP technique where each row has the format [0..W]. So for big weights like 1500, a table of [0..1500][] would be a total waste of space. Any recommendations on how to avoid such waste? Is it better to use a recursive algorithm for such cases, or is there a way to build the table without allocating this much memory?
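To make the concern concrete, here is a minimal sketch of the table-style DP I mean, in JavaScript since I'm on Node. I'm assuming the object maps weight to value as in the example, and the capacity of 2000 is just for illustration. Even the space-optimized one-row variant still allocates an array of length W + 1:

```js
// Classic 0/1 knapsack, space-optimized to a single row of length W + 1.
// items maps weight -> value (as in the example above); W is the capacity.
function knapsackTable(items, W) {
  const dp = new Array(W + 1).fill(0); // one entry per capacity 0..W
  for (const [weightKey, value] of Object.entries(items)) {
    const weight = Number(weightKey);
    // iterate capacities downwards so each item is used at most once
    for (let c = W; c >= weight; c--) {
      dp[c] = Math.max(dp[c], dp[c - weight] + value);
    }
  }
  return dp[W];
}

console.log(knapsackTable({ 300: 256, 653: 300, 1500: 500 }, 2000)); // 756
```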
- Welcome to SO! This is why 0/1 knapsack is pseudo-polynomial--you can blow the problem space up just by adding bits to the capacity of the sack. See [How to understand the knapsack problem is NP-complete?](https://stackoverflow.com/questions/3907545/how-to-understand-the-knapsack-problem-is-np-complete) – ggorlen Mar 24 '21 at 00:11
- Does this answer your question? [Why is the knapsack problem pseudo-polynomial?](https://stackoverflow.com/questions/4538581/why-is-the-knapsack-problem-pseudo-polynomial) – ggorlen Mar 24 '21 at 00:11
- You can use a `std::map` in C++ to store only those values which are really required in your calculation; see the sketch after these comments. Although do keep in mind that it can go to O(n * W) in the worst case as well – risingStark Apr 07 '21 at 23:10
- @ggorlen yes, definitely! Thank you! I decided to go with a simple recursive solution since the number of elements is small; it's just their values that can be large. – Dani Ezzeddine Apr 13 '21 at 04:51
- @risingStark Unfortunately I use NodeJS, but thanks, I hope it helps someone else who happens to see this :) – Dani Ezzeddine Apr 13 '21 at 04:52
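To tie the comments together, here is a rough Node/JavaScript sketch of the kind of solution discussed above: a recursive 0/1 knapsack memoized with a `Map` keyed by (item index, remaining capacity), so only states that are actually reached get stored instead of the whole 0..W range. The item format and the capacity of 2000 are just placeholders matching the example in the question.

```js
// Top-down 0/1 knapsack memoized with a Map, so only reachable
// (index, remaining capacity) states are ever stored.
function knapsackRecursive(items, W) {
  // items: array of [weight, value] pairs
  const memo = new Map();

  function best(i, remaining) {
    if (i === items.length || remaining === 0) return 0;
    const key = `${i},${remaining}`;
    if (memo.has(key)) return memo.get(key);

    const [weight, value] = items[i];
    let result = best(i + 1, remaining); // skip item i
    if (weight <= remaining) {
      // take item i and solve the rest with the reduced capacity
      result = Math.max(result, value + best(i + 1, remaining - weight));
    }
    memo.set(key, result);
    return result;
  }

  return best(0, W);
}

const items = Object.entries({ 300: 256, 653: 300, 1500: 500 })
  .map(([w, v]) => [Number(w), v]);
console.log(knapsackRecursive(items, 2000)); // 756
```

As noted in the comments, the number of memoized states is still bounded by O(n * W) in the worst case, but for a handful of items with large, sparse weights only a tiny fraction of those states is ever created.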