Note: I know this question has been asked here before, but I didn't fully understand the solution, and I don't have enough reputation to comment and clarify my query. (I'm a beginner on SO.)
Question: Given an array of size n, divide it into two non-contiguous subarrays (i.e., subsets) of sizes n/2 and n/2 if n is even, or (n-1)/2 and (n+1)/2 if n is odd, such that the absolute difference between the sums of the two parts is minimized.
Example:
4 5 3 55 -3 23 3 20 100 90
Answer: 0.
Explanation: {100, 20, 23, 4, 3} and {5, 55, -3, 3, 90}, both of which sum to 150.
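For reference, here is a small brute-force check of the example (Python is just my choice here for the sketch; it only verifies the expected answer and is obviously not an efficient solution):

```python
from itertools import combinations

# Brute-force check of the example above: try every subset of size n//2
# and track the smallest absolute difference between the two halves' sums.
arr = [4, 5, 3, 55, -3, 23, 3, 20, 100, 90]
total = sum(arr)
n = len(arr)

# |sum(half) - (total - sum(half))| == |2 * sum(half) - total|
best = min(abs(2 * sum(half) - total) for half in combinations(arr, n // 2))
print(best)  # prints 0, matching the expected answer
```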
I know a greedy approach won't work for this, and I feel it involves dynamic programming. However, I am unable to model a solution that enforces the equal-size constraint: my current DP produces minimum-difference subsets of arbitrary size, which is just the general partition problem (a rough sketch of what I have is below).
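To be concrete, this is roughly the shape of my current approach (a simplified sketch, not my exact code; the function name is just for illustration). It finds the subset sum closest to half the total, but pays no attention to how many elements end up in each part:

```python
def min_diff_any_size(arr):
    """Sketch of the arbitrary-size partition DP I described.
    'reachable' holds every sum achievable by some subset of the
    elements processed so far, with no record of subset sizes."""
    total = sum(arr)
    reachable = {0}
    for x in arr:
        reachable |= {s + x for s in reachable}
    # The subset sum closest to total/2 gives the minimum difference,
    # but the chosen subset can be of any size, not necessarily n/2.
    return min(abs(total - 2 * s) for s in reachable)

# Prints 0 for the example, but the two parts need not be of equal size.
print(min_diff_any_size([4, 5, 3, 55, -3, 23, 3, 20, 100, 90]))
```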
I am also unsure whether building a DP table is a good idea, because the values (and hence the possible sums) can be huge. (Although it's guaranteed that all input fits within the bounds of an int.)
Can anyone suggest an algorithm that I could apply?
Here is the link to the question that was already asked (and answered): Dividing an array into two subsets of equal sizes having minimum difference in sum of values.