8

I have 4 arrays A, B, C, D of size n, where n is at most 4000. The elements of each array are 30-bit (positive/negative) numbers. I want to know the number of ways A[i]+B[j]+C[k]+D[l] = 0 can be formed, where 0 <= i,j,k,l < n.

The best algorithm I have derived is O(n^2 lg n); is there a faster one?

Michael J. Barber
russell
  • Could you explain what this `O(n^2lgn)` algorithm is? – flight Sep 26 '11 at 10:46
  • 6
    For example, suppose arrays A, B, C, D each contain n zeros; then the number of quads that sum up to zero is O(n^4). How can an algorithm that runs in O(n^2 log(n)) find all these quads? Are permutations of the same numbers ignored? – Saeed Amiri Sep 26 '11 at 10:48
  • @quasiverse it means upper bounded by n-square*log(n) – Suraj Chandran Sep 26 '11 at 10:49
  • @SurajChandran I know. I was asking if he could provide *the algorithm* that runs in `O(n^2lgn)`. – flight Sep 26 '11 at 10:50
  • @Saeed: I was wondering if ordered information might allow you to do something clever (eg sums of only negatives will clearly never equal 0) but most ways I'm thinking of still have an n^4 term in them somewhere... – Chris Sep 26 '11 at 10:54
  • O(N^3 lg n) is obvious: loop through first 3, search 4th. Don't see the better one yet, though. – Tom Zych Sep 26 '11 at 10:54
  • @TomZych That assumes unique elements as pointed out by Saeed. – flight Sep 26 '11 at 10:56
  • @quasiverse, OK, I'll explain. Add A[i] where 0<=i<n to B[j] where 0<=j<n; you get n^2 elements, store them in an array E[n*n]. Do the same thing for C[] and D[], then place those sums in an STL map. Then for each element i of E[], look up Map[-E[i]], which is the number of ways we can reach 0 using element i. Add up all the ways and that is the answer. – russell Sep 26 '11 at 10:58
  • We can divide each array into positive and negative numbers, and also into classes by various moduli, which might help. – Tom Zych Sep 26 '11 at 10:58
  • 1
    @Saeed: If you just need to find the number of ways rather than list them all, the runtime can be less than O(n^4). – interjay Sep 26 '11 at 10:59
  • 1
    @russell: why is that n^2 lg n? You don't need to binary-search, you can hash, yes? n^2. (And of course you can hash for what I wrote earlier, too, now that I see it.) – Tom Zych Sep 26 '11 at 11:01
  • @russell, your way is wrong, because you forgot that different numbers can sum to equal values, for example 3+(-3)=0 and 1+(-1)=0, and you assume all are the same. – Saeed Amiri Sep 26 '11 at 11:04
  • @interjay, maybe, but I have no idea about this. – Saeed Amiri Sep 26 '11 at 11:06
  • Yeah, I know I can hash, but doesn't it require some lookup time? The STL map takes about O(lg n) time as far as I know, and if I use binary search it also takes an extra O(lg n). So is O(n^2 lg n) the optimal time? – russell Sep 26 '11 at 11:07
  • @Saeed: associate a count with each sum. – Tom Zych Sep 26 '11 at 11:07
  • I came to know from this link - http://stackoverflow.com/questions/222658/multiset-map-and-hash-map-complexity - that there is no way to guarantee hash lookup in O(1) time (all the time). So is O(n^2 lg n) optimal? – russell Sep 26 '11 at 11:17
  • 2
    @russell: please post your O(n^2logn) solution as an answer, so it can be upvoted and accessed more easily by future readers. – amit Sep 26 '11 at 16:14
  • O(n^2 lg n) is big O notation. It describes the complexity of the algorithm, how the run time is affected by n. – ctrl-alt-delor Sep 27 '11 at 11:32
  • @russell when you have found a match in the map/hash, you have to ask how many times this number appears in the map/hash – ctrl-alt-delor Sep 27 '11 at 12:15
  • russell: well, in practice with a well-configured hash you'll have O(1) lookup, but yes, *theoretically* it can be worse. I wouldn't *implement* it with a red-black tree to get guaranteed lookup time... that's just silly. – Karoly Horvath Sep 27 '11 at 13:26

3 Answers

4

OK, here is my O(n^2 lg(n^2)) algorithm:

Suppose there are four arrays A[], B[], C[], D[]. We want to find the number of ways A[i]+B[j]+C[k]+D[l] = 0 can be made, where 0 <= i,j,k,l < n.

First, sum up all possible pairings of A[] and B[] and place them in another array E[], which contains n*n elements.

int k = 0;
for (i = 0; i < n; i++)
{
    for (j = 0; j < n; j++)
    {
        E[k++] = A[i] + B[j];
    }
}

The complexity of the above code is O(n^2).

Do the same thing for C[] and D[].

int l = 0;
for (i = 0; i < n; i++)
{
    for (j = 0; j < n; j++)
    {
        AUX[l++] = C[i] + D[j];
    }
}

The complexity of the above code is O(n^2).

Now sort AUX[] so that you can easily count the occurrences of each unique element in it.

The sorting complexity of AUX[] is O(n^2 lg(n^2)).
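In C++ this can be done with the standard library sort (a minimal sketch; any O(m lg m) comparison sort over the m = n*n elements works just as well):

#include <algorithm>

std::sort(AUX, AUX + n * n);   // sorts the n*n pairwise sums, O(n^2 lg(n^2))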

Now declare a structure:

struct myHash
{
    int value;                      // a unique sum value taken from AUX[]
    int valueOccurredNumberOfTimes; // how many times it occurs in AUX[]
} F[MAXELEMS];                      // MAXELEMS: some bound >= n*n

Now place each unique element of AUX[] in F[], together with the number of times it appears.

The complexity of this step is O(n^2).
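This step is not spelled out above; here is a minimal sketch of one way to do it (the counter m is illustrative, and AUX[] is assumed to be sorted already):

int m = 0;                                       // number of unique sums in F[]
for (i = 0; i < n * n; i++)
{
    if (m > 0 && AUX[i] == F[m - 1].value)
    {
        F[m - 1].valueOccurredNumberOfTimes++;   // repeat of the previous sum
    }
    else
    {
        F[m].value = AUX[i];                     // first time this sum is seen
        F[m].valueOccurredNumberOfTimes = 1;
        m++;
    }
}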

long long possibleQuadruples = 0;   /* 64-bit counter: the answer can be as large as n^4 */

Now, for each item of E[], do the following:

for (i = 0; i < k; i++)
{
    int x = E[i];

    /* binary search for -x in F[0..m-1], which is sorted by value */
    int lo = 0, hi = m - 1, j = -1;
    while (lo <= hi)
    {
        int mid = lo + (hi - lo) / 2;
        if (F[mid].value == -x)      { j = mid; break; }
        else if (F[mid].value < -x)  lo = mid + 1;
        else                         hi = mid - 1;
    }

    if (j != -1)   /* found at position j */
    {
        possibleQuadruples += F[j].valueOccurredNumberOfTimes;
    }
}

The loop over i performs n^2 iterations in total, and each iteration does a binary search that costs lg(n^2) comparisons. So the overall complexity is O(n^2 lg(n^2)).

The number of ways 0 can be reached is possibleQuadruples.

You can use either an STL map or binary search here. But the STL map is slow, so it's better to use binary search.
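For comparison, here is a minimal sketch of the map variant mentioned here and in the comments (this is not the submitted code; names are illustrative):

#include <map>

std::map<int, long long> counts;              // sum C[i]+D[j] -> occurrences
for (int i = 0; i < n; i++)
    for (int j = 0; j < n; j++)
        counts[C[i] + D[j]]++;                // n^2 insertions, O(lg n) each

long long ways = 0;
for (int i = 0; i < n; i++)
    for (int j = 0; j < n; j++)
    {
        std::map<int, long long>::const_iterator it = counts.find(-(A[i] + B[j]));
        if (it != counts.end())
            ways += it->second;               // every matching (k,l) pair counts
    }

Replacing the map with a hash table gives expected O(n^2) overall, as Tom Zych and Karoly Horvath point out in the comments, though the worst-case lookup is not guaranteed to be O(1).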

Hope my explanation is clear enough to understand.

russell
  • (+1) Can't say whether this is optimal, but at first blush it seems correct. Thanks for sharing. – NPE Sep 27 '11 at 13:23
  • @aix, this is correct; I submitted this code to the UVa online judge and the verdict is Accepted. The run time can be reduced using extra space, but the overall complexity of O(n^2 lg(n)) can't be reduced. – russell Sep 27 '11 at 13:33
  • Sorry, I overlooked one thing: the complexity of the above solution is O(n^2 lg(n^2)). – russell Sep 28 '11 at 09:00
  • 1
    Mathematically, `O(n^2 lg(n^2))` is the same as `O(n^2 lg(n))`, since `lg(n^2)==2*lg(n)`. – NPE Sep 28 '11 at 09:23
-1

I disagree that your solution is as efficient as you say. In your solution, populating E[] and AUX[] is O(N^2) each, so 2·N^2 in total. Each of these arrays will have N^2 elements.

Generating x = O(N)

Sorting AUX = O((2N)*log((2N)))

The binary search for E[i] in AUX[] means N^2 elements, each to be found among N^2 elements.

Thus you are still doing N^4 work, plus the extra work of generating the intermediate arrays and of sorting the N^2 elements in AUX[].

I have a solution (work in progress) but I find it very difficult to calculate how much work it is. I deleted my previous answer. I will post something when I am more sure of myself.

I need to find a way to compare O(X)+O(Z)+O(X^3)+O(X^2)+O(Z^3)+O(Z^2)+X·log(X)+Z·log(Z) to O(N^4), where X+Z = N.

It is clearly less than O(N^4)... but by how much? My math is failing me here.

-1

The judgement is wrong. The supplied solution generates arrays of size N^2. It then operates on these arrays (sorting, etc.).

Therefore the order of work, which would normally be O(n^2·log(n)), should have n substituted with n^2. The result is therefore O((n^2)^2·log(n^2)).