I recently came across this question, and was unsure how to solve it. I was given an array of positive integers less than a million, and for each number, I had to compute the largest bit difference it had with any other number in the array, where the bit difference between n1 and n2 was defined as the number of set bits in n1 xor n2. So for example, the input
3 (011)
5 (101)
4 (100)
should output the array
3 (3 ^ 4 = 7, which has 3 set bits)
2 (5 ^ 3 = 6, which has 2 set bits)
3 (4 ^ 3 = 7, which has 3 set bits)
where each number in the output array is the maximum bit difference over all pairs that include the input number at the same index (so, for instance, the pair used for the first output must include 3, since 3 is the first number in the input array).

The size of the array can be very large, so an O(N^2) solution where I considered each pair was too slow. I suspected that the solution had to do with exploiting the fact that the numbers must be less than a million, since 2^20 = 1,048,576 > 1,000,000 means each number fits in 20 bits. However, I was unable to proceed with this idea.
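For concreteness, here is roughly what that too-slow brute force looks like (a minimal sketch in Python; the question doesn't fix a language, so that choice is mine):

```python
# Sketch of the O(N^2) brute force: for each number, try every other
# number and keep the largest popcount of the XOR. Too slow for large
# inputs, but it reproduces the example above.
def max_bit_differences(nums):
    result = []
    for i, a in enumerate(nums):
        best = 0
        for j, b in enumerate(nums):
            if i != j:
                # bit difference = number of set bits in a xor b
                best = max(best, bin(a ^ b).count("1"))
        result.append(best)
    return result

print(max_bit_differences([3, 5, 4]))  # prints [3, 2, 3]
```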