
I've written a piece of code that works fine. I want a new array consisting of the elements from myArr in the order specified in orderArr. However, it uses a for loop inside another for loop to match the array elements.

var myArr = ['a', 'b', 'c', 'd', 'e'];
var orderArr = ['e', 'c'];
var reArr = [];

for (var i = 0; i < orderArr.length; i++) {
  for (var k = 0; k < myArr.length; k++) {
    if (myArr[k] == orderArr[i]) {
      reArr.push(myArr[k]);
    }
  }
}

console.log(reArr);

I've often heard that using a for loop inside another for loop is bad practice, and that even forEach should be avoided.

How else can I rewrite this code?

Get Off My Lawn
Stacy J
    _"I've often heard that using for loops inside another for loop is bad practice"_ From anywhere in particular? While there might be shorter ways of writing this code, chances are internally they'll still use a couple of loops to do their work. – James Thorpe Jan 23 '18 at 22:05
    :-) from other developers in the project..they say it is a performance issue. I am not sure how... – Stacy J Jan 23 '18 at 22:06
  • Ask them how they intend to reliably and accurately measure it to show that it's a performance issue...! I'm willing to bet that the single call to `console.log` at the end is far slower than all the loops combined... – James Thorpe Jan 23 '18 at 22:08
  • what is your ultimate goal? nested for loops are the right tool for the right job. In this case you may want Array#includes instead though – Jaromanda X Jan 23 '18 at 22:10
  • Possible duplicate of [Simplest code for array intersection in javascript](https://stackoverflow.com/questions/1885557/simplest-code-for-array-intersection-in-javascript) – Herohtar Jan 23 '18 at 22:11
  • I want a new array consisting of the elements from myArr in the order specified in orderArr – Stacy J Jan 23 '18 at 22:12

9 Answers


I wouldn't necessarily say that using loops inside loops is bad practice; in fact, Ori Drori beat me to stating that the efficiency of such a practice may simply depend on the size of the data set.

For example, many sorting algorithm implementations contain loops within loops, and the details of the implementation determine how the performance changes as the data size grows.
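
To illustrate the point, here is a minimal sketch of my own (not part of the answer): a textbook insertion sort is nothing more than a loop inside a loop, and whether that matters depends entirely on how large the input is.

// Illustration only: insertion sort is a nested loop whose cost
// only becomes noticeable as the input grows.
function insertionSort(arr) {
  var out = arr.slice();
  for (var i = 1; i < out.length; i++) {
    var current = out[i];
    var j = i - 1;
    // Inner loop: shift larger elements one slot to the right.
    while (j >= 0 && out[j] > current) {
      out[j + 1] = out[j];
      j--;
    }
    out[j + 1] = current;
  }
  return out;
}

console.log(insertionSort(['e', 'c', 'a', 'd', 'b'])); // ["a", "b", "c", "d", "e"]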

My own personal experience is that when I see nested loops, I immediately ask myself whether the operation needs to be optimized with a different algorithm. More often than not in JavaScript (browser) applications the answer is "no", because the data sizes are rarely big enough to make a meaningful difference.

arthurakay

The complexity of the nested loops in your case is O(n * m), where n is the length of orderArr and m is the length of myArr.

This solution's complexity is O(n + m), because we create the dictionary object using Array#reduce with a complexity of O(m), and then filter orderArr with a complexity of O(n).

Note 1: For small arrays this shouldn't really matter, so unless you've got a few thousand items in both arrays, you can stay with the nested loops.

Note 2: This code assumes that there are no duplicates in myArr. If there are duplicates, the result will differ from that of the nested loops.

var myArr = ['a', 'b', 'c', 'd', 'e'];
var orderArr = ['e', 'c'];

// Build a lookup dictionary of the values in myArr - O(m)
var myArrDict = myArr.reduce(function(r, s) {
  r[s] = true;
  return r;
}, Object.create(null));

// Keep only the orderArr values that appear in the dictionary - O(n)
var reArr = orderArr.filter(function(s) {
  return this[s];
}, myArrDict);

console.log(reArr);
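
If the duplicates mentioned in Note 2 matter, one possible variant (my own sketch, not part of this answer) stores occurrence counts instead of booleans, which reproduces the nested-loop output:

var myArr = ['a', 'b', 'c', 'd', 'e', 'e'];
var orderArr = ['e', 'c'];

// Count the occurrences of each value in myArr instead of storing a flag.
var counts = myArr.reduce(function(r, s) {
  r[s] = (r[s] || 0) + 1;
  return r;
}, Object.create(null));

var reArr = [];
orderArr.forEach(function(s) {
  // Push the value once per occurrence in myArr.
  for (var i = 0; i < (counts[s] || 0); i++) {
    reArr.push(s);
  }
});

console.log(reArr); // ["e", "e", "c"]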
Ori Drori
  • A problem with this approach is that it won't preserve duplicates from the original array in the returned array (which the looping code does). – SBFrancies Jan 23 '18 at 22:21
  • @SBFrancies - that's a good point, which I haven't considered. I'll add a note. – Ori Drori Jan 23 '18 at 22:26

Personally, I like to have a dedicated correlate function for this particular scenario.

"use strict";

function correlate(
  outer,
  inner, 
  outerKeyOrKeySelector = x => x,
  innerKeyOrKeySelector = x => x
) {
  const outerValues = [...outer];
  const innerValues = [...inner];

  const outerToKey = typeof outerKeyOrKeySelector === 'function'
      ? outerKeyOrKeySelector
      : (x => x[outerKeyOrKeySelector]);

  const innerToKey = typeof innerKeyOrKeySelector === 'function'
      ? innerKeyOrKeySelector
      : (x => x[innerKeyOrKeySelector]);

  const outerKeyed = outerValues.map(value => ({key: outerToKey(value), value}));
  const innerKeyed = innerValues.map(value => ({key: innerToKey(value), value}));

  return outerKeyed.reduce((results, {key: oKey, value: oValue}) => [
     ...results,
     ...innerKeyed
         .filter(({key: iKey}) => oKey === iKey)
         .map(({value: iValue}) => [oValue, iValue])
    ], []);
}

This basically acts like a JOIN and allows you to correlate the elements of two arrays, equating elements directly or by the value of a specified property or projection function.

It may seem like overkill, and YMMV, but I find it very useful.

Applied to the problem at hand:

var reArr = correlate(orderArr, myArr).map(([first, second]) => first);

But it handles more complex scenarios just as easily:

var correlated = correlate(orders, customers, o => o.customer.name, c => c.name);

correlated.forEach(([order, customer]) => {
  customer.orders.push(order);
});
Aluan Haddad

I don't think there is anything wrong with your code, but if you really want to, you can eliminate the loops that are visible to you with this:

var reArr = myArr.filter((n) => orderArr.includes(n));

(This will not work in old versions of IE.)
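
For older browsers that lack Array#includes, an indexOf-based equivalent (my own sketch, not part of the original answer) would be:

// Assumes only ES5 features; indexOf returns -1 when the value is absent.
var reArr = myArr.filter(function(n) {
  return orderArr.indexOf(n) !== -1;
});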

SBFrancies

What you can do is combine the two arrays and then, using reduce, keep only the values that appear in both and discard everything else, like so:

var myArr = ['a', 'b', 'c', 'd', 'e'];
var orderArr = ['e', 'c'];
// Merge the two arrays
var reArr = myArr.concat(orderArr);

let result = reArr.reduce((arr, val) => {
  // Keep a value only if it occurs more than once in the merged array
  // (meaning it exists in both myArr and orderArr) and has not already
  // been added to the accumulator.
  //
  // 'indexOf()' returns -1 when the value is not yet in the accumulator;
  // the filter then counts how many times the value occurs in the merged
  // array, and a count greater than one means it was present in both inputs.
  return reArr.filter(v => v == val && arr.indexOf(val) == -1).length > 1
    // Not seen before and present in both arrays: append it with 'concat()',
    // otherwise return the accumulator unchanged.
    ? arr.concat(val) : arr
}, [])

console.log(result);
Get Off My Lawn

To complement some of the good answers here (notably Ori's), I believe that the optimal solution for two very large arrays would be a merge join (this is how a relational database would handle the case). The idea here is that we only bother comparing values we know are approximately the same.

The first step is to sort both arrays (O(n log n) + O(m log m)), then compare the first value in each array. There are 3 possibilities:

A<B
  We discard A and fetch the next value from the same list A came from
A=B
  We add A to the keep list and retrieve the next value from that array
A>B
  We discard B and retrieve the next value from its list

...and repeat the process until one of the lists is depleted, giving...

 O(n log n) + O(m log m) + O(least(n,m))
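
The answer describes the algorithm in prose only; a minimal sketch of such a merge join (my own illustration, not code from the answer) could look like this. Note that the result comes out in sorted order, so it would need re-mapping if the ordering of orderArr has to be preserved.

var myArr = ['a', 'b', 'c', 'd', 'e'];
var orderArr = ['e', 'c'];

function mergeJoin(a, b) {
  var sortedA = a.slice().sort();   // O(n log n)
  var sortedB = b.slice().sort();   // O(m log m)
  var keep = [];
  var i = 0, j = 0;

  // Walk both sorted arrays with one pointer each; stop when either is exhausted.
  while (i < sortedA.length && j < sortedB.length) {
    if (sortedA[i] < sortedB[j]) {
      i++;                          // A < B: discard A, advance its list
    } else if (sortedA[i] > sortedB[j]) {
      j++;                          // A > B: discard B, advance its list
    } else {
      keep.push(sortedA[i]);        // A = B: keep the value, advance A's list
      i++;
    }
  }
  return keep;
}

console.log(mergeJoin(orderArr, myArr)); // ["c", "e"]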
symcbean

Here is a very simple ES5 version that avoids the inner loop, provided that your values are not objects:

var myArr = ['a', 'b', 'c', 'd', 'e'];
var orderArr = ['e', 'c'];
var myArrIndex = {};
var reArr = [];

for (var i = 0; i < myArr.length; i++) {
    myArrIndex[myArr[i]] = myArr[i];
}

for (var i = 0; i < orderArr.length; i++) {
    var orderArrayItem = orderArr[i];
    if (myArrIndex[orderArrayItem]) {
        reArr.push(orderArrayItem);
    }
}


console.log(reArr);
mmuncada
bhantol

Map myArr into an object. I will add an extra 'a' to myArr and an 'f' to orderArr.

var myArr = ['a', 'a', 'b', 'c', 'd', 'e'];
var orderArr = ['e', 'c', 'f']
var myObj = createObjectFrom(myArr);
var reArr = [];

function createObjectFrom(array) {
   var obj = {};
   var arrValue;

   for (var i = 0; i < array.length; i++) {
     arrValue = array[i];

     if (obj[arrValue]) {
       obj[arrValue]++;
     } else {
       obj[arrValue] = 1;
     }
   }

   return obj;  // { 'a': 2, 'b': 1, 'c': 1, 'd': 1, 'e': 1 }
}

var orderValue;

for(var i=0;i<orderArr.length;i++){
  orderValue = orderArr[i];

  if (myObj.hasOwnProperty(orderValue)) {
     for (var j = 0; j < myObj[orderValue]; j++) {
       reArr.push(orderValue);
     }
   }
}

console.log(reArr);  // [ "e", "c" ]

Instead of looping (6*3=) 18 times, you only need to loop (6+3=) 9 times.

I made the code a little bit more complex to take duplicates into account, hence the second for loop (j).


As an added bonus, here is a video that I think you may find interesting:

https://www.youtube.com/watch?v=XKu_SEDAykw

Rickard Elimää

You could take a Map and collect all equal values in an array for later concatenation with the result set.

var array = ['a', 'b', 'c', 'd', 'e'],
    order = ['e', 'c'],
    map = new Map,
    result;
    
array.forEach(v => map.set(v, (map.get(v) || []).concat(v)));

result = order.reduce((r, v) => r.concat(map.get(v)), []);

console.log(result);
Nina Scholz