I am writing a function that determines the number of subsets (the size of the power set) of a list. After working through the problem, I found the relationship between the length of the list and the number of subsets:
2^i
where i is the length of the list.
This formula holds true for:
Powers.powers(new int[]{}); // 1
Powers.powers(new int[]{1}); // 2
Powers.powers(new int[]{1,2}); // 4
Powers.powers(new int[]{1,2,3,4}); // 16
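The reason the formula holds: each element of the list is either included in a subset or excluded from it, so every bitmask of i bits selects exactly one distinct subset. A quick standalone sketch (the demo class is my own, not part of the kata) that enumerates them for {1, 2}:

import java.util.ArrayList;
import java.util.List;

public class SubsetDemo {
    public static void main(String[] args) {
        int[] list = {1, 2};
        int n = list.length;
        // Bitmasks 0 .. 2^n - 1: bit i set means list[i] is in the subset.
        for (int mask = 0; mask < (1 << n); mask++) {
            List<Integer> subset = new ArrayList<>();
            for (int i = 0; i < n; i++) {
                if ((mask & (1 << i)) != 0) subset.add(list[i]);
            }
            System.out.println(subset);
        }
        // Prints [], [1], [2], [1, 2]: four subsets, i.e. 2^2.
    }
}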
Using this information, I wrote the following code:
if (list.length == 0) return BigInteger.valueOf(1);
if (list.length == 1) return BigInteger.valueOf(2);
return BigInteger.valueOf((long) Math.pow(2, list.length));
This works fine for the first few tests, but it hiccups at array length 100. Math.pow does produce the mathematically correct value, roughly 1.2676506e+30, yet the expected number of subsets for an array of size 100 is: 9223372036854775807.
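One observation: 9223372036854775807 is exactly Long.MAX_VALUE, and Java's narrowing conversion from double to long clamps any value larger than Long.MAX_VALUE down to Long.MAX_VALUE. That means the (long) cast in my return statement produces precisely this number for length 100. A minimal standalone snippet (demo class name is mine) that reproduces it:

public class CastDemo {
    public static void main(String[] args) {
        double p = Math.pow(2, 100);
        System.out.println(p);               // 1.2676506002282294E30
        System.out.println((long) p);        // 9223372036854775807
        System.out.println(Long.MAX_VALUE);  // 9223372036854775807
    }
}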
Edit: I adjusted the formula to 2^i. To clarify, I do understand how the calculation works; I just don't understand how or why the test case expects 9223372036854775807. The test passes an array of length 100 with all values 0 except index 99, which is 100.
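For reference, here is a reconstruction of that test input, together with an overflow-free sketch that computes 2^n exactly using BigInteger.shiftLeft instead of going through double and long (the class and method names are my own, not the kata's):

import java.math.BigInteger;

public class ExactDemo {
    // Sketch: 2^n computed entirely in BigInteger, with no narrowing casts.
    static BigInteger powersExact(int[] list) {
        return BigInteger.ONE.shiftLeft(list.length); // 1 shifted left n places == 2^n
    }

    public static void main(String[] args) {
        // Reconstruction of the failing input: length 100, all zeros
        // except index 99, which is 100.
        int[] test = new int[100];
        test[99] = 100;
        System.out.println(powersExact(test));
        // Prints 1267650600228229401496703205376, the exact value of 2^100.
    }
}

Note that the array contents never enter the computation; only list.length does, which is why the specific values (all zeros plus one 100) shouldn't matter.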