
This was an Amazon interview question I had, and my answer was:

function intersection ( A , B ) 
{
    var C = [];
    for ( var a in A ) if ( B.indexOf(a) != -1 ) C.push(a);
    return C;
}

and he asked what the order of complexity was and I said, and I quote exactly,

O(m * n) where m=A.length and n=B.length

and he said there was a better way to do it, and I was like WTF??? He was saying to use A and B as objects, and I was like:

"But you said these were arrays That was your question!!!!"

Can someone help me out here?

  • I'm not sure how 'use as objects' helps, but if you make a lookup-table (dictionary) from `B` before you start, then you can do it in O(n). – Lee Sep 25 '15 at 16:30
  • If you're in another JavaScript interview, don't use `for ... in` to loop through arrays. – Pointy Sep 25 '15 at 16:31
  • btw there are lots of questions on SO for this question, e.g. http://stackoverflow.com/questions/1885557/simplest-code-for-array-intersection-in-javascript – Lee Sep 25 '15 at 16:32
  • Creating a lookup table from B would require touching each element of B plus the complexity of inserting into the dictionary. The complexity would remain O(m * n) – Jon Trauntvein Sep 25 '15 at 16:32
  • @JonTrauntvein no it would be O(n+m), which is really just O(n) – Lee Sep 25 '15 at 16:33
  • If `A` and `B` are arrays, this won't do what it pretends to. In `for (var a in A) { ... }` the variable `a` will hold the indices. – Andreas Sep 25 '15 at 16:33
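
For reference, here is a minimal sketch of what the last two comments are pointing at: `for ... in` iterates indices (as strings), so the question's code compares A's indices against B's values. Looping over the values directly keeps the same O(m * n) approach but returns the intended result:

function intersection ( A , B ) 
{
    var C = [];
    // loop over the values of A, not its indices
    for ( var i = 0; i < A.length; i++ )
        if ( B.indexOf(A[i]) != -1 ) C.push(A[i]);
    return C;
}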

1 Answer


If you know that the array values are strings or numbers, you can create an object that has the values as property names, with a truthy value for each. Then you can use a simple object lookup in a single pass through the second array. Building the object is O(m) and the pass through the second array is O(n), so the whole thing is O(m + n) instead of O(m * n).

Something like:

function intersection ( A , B ) 
{
    // build a lookup object from A: O(m)
    var m = A.reduce(function(m, v) { m[v] = 1; return m; }, {});
    // keep the values of B that appear in the lookup: O(n)
    return B.filter(function(v) { return m[v]; });
}

Edit: to remove duplicates from the result, a `.reduce()` pass can be used in place of the `.filter()`:

function intersection ( A , B ) 
{
    // build a lookup object from A: O(m)
    var m = A.reduce(function(m, v) { m[v] = 1; return m; }, {});
    // single pass through B: keep values that are in A and not already in the result
    return B.reduce(function(rv, v) {
      if (m[v] && !rv.m[v]) {
        rv.m[v] = 1;
        rv.l.push(v);
      }
      return rv;
    }, {m:{}, l:[]}).l;
}
Pointy
  • And if the values are objects but reference equality is all right, you can use a `Set()` – Touffy Sep 25 '15 at 16:33
  • @Touffy yes good point, in fact you could use a Set anyway. I haven't fully acclimated to ES2015 yet :) – Pointy Sep 25 '15 at 16:34
  • Note that in practice, if A.length or B.length is small, you might actually get better performance with an array lookup: http://jsperf.com/key-or-array-search/2 (from http://stackoverflow.com/questions/26353417/javascript-object-vs-array-lookup-performance) – nrabinowitz Sep 25 '15 at 16:38
  • @nrabinowitz yes, somehow modern browsers do `.indexOf()` incredibly quickly. – Pointy Sep 25 '15 at 16:41
  • Or they have more overhead than expected for object key lookup. – nrabinowitz Sep 25 '15 at 16:42
  • wrong result with `[1, 2]`, `[1, 1]` – solimanware Feb 17 '17 at 19:29
  • @Microsmsm right because it doesn't eliminate duplicates. A pass through `.reduce()` after the `.filter()` would deal with that. Answer updated. – Pointy Feb 18 '17 at 00:11
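
As a follow-up to the `Set()` suggestion in the comments, here is a minimal sketch of that variant, assuming an ES2015 environment. It keeps the O(m + n) behaviour, works for objects compared by reference, and skips duplicates in the result:

function intersection ( A , B ) 
{
    var setA = new Set(A);   // O(m) to build the lookup
    var seen = new Set();    // tracks values already added, so duplicates in B are skipped
    return B.filter(function(v) {
        if (setA.has(v) && !seen.has(v)) {
            seen.add(v);
            return true;
        }
        return false;
    });
}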