
I am considering the two approaches below for building an array of objects:

Approach 1 (list all properties, even if duplicated among objects):

const employees = [
  {
    company: 'ABC',
    country: 'IN',
    zip: 123,
    employeeId: 123,
    employeeName: 'p'
  },
  {
    company: 'ABC',
    country: 'IN',
    zip: 123,
    employeeId: 456,
    employeeName: 'q'
  },
  {
    company: 'ABC',
    country: 'IN',
    zip: 123,
    employeeId: 789,
    employeeName: 'r'
  }
];

Approach 2 (avoid duplication with the spread operator):

const commonParams = {
  company: 'ABC',
  country: 'IN',
  zip: 123
};

const employees = [
  {
    ...commonParams,
    employeeId: 123,
    employeeName: 'p'
  },
  {
    ...commonParams,
    employeeId: 456,
    employeeName: 'q'
  },
  {
    ...commonParams,
    employeeId: 789,
    employeeName: 'r'
  }
];

Approach 2 is more succinct, and adding a new property common to all array elements would be much easier (and less error-prone).

However, in the case of a large commonParams object, does approach 2 (using the spread operator) affect performance compared to approach 1?

Would the spread operator loop through each of the properties of the commonParams object for each of the objects in the employees array?

– Boney
5 Answers


Yes, spreading a variable which refers to an object into another object requires the interpreter to look up what the variable refers to, and then to look up all the enumerable own properties (and their associated values) of the object being spread so as to insert them into the new object. This does indeed take a bit of processing power.
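
For instance, a quick sketch (with made-up objects) of what "enumerable own properties" means for spread:

const proto = { inherited: true };
const source = Object.create(proto);      // `inherited` lives on the prototype
source.visible = 1;                       // enumerable own property
Object.defineProperty(source, 'hidden', { // non-enumerable own property
  value: 2,
  enumerable: false
});

const copy = { ...source };
console.log(copy.visible);   // 1 - enumerable own properties are copied
console.log(copy.inherited); // undefined - inherited properties are not
console.log(copy.hidden);    // undefined - non-enumerable properties are not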

But, on modern computers, and on modern JS engines, the processing power required is next to nothing; what does it matter, when millions of instructions can be processed each second? A handful of key-value pairs is nothing to worry about.

Unless you've identified that you're spreading an object with tons of key-value pairs and it's actually causing a performance bottleneck, it's better to avoid premature optimization and aim to write clean, readable code instead (which may well involve using spread syntax often). For a large employees array, the second approach is more readable than the first.

(Though you might also consider using .map to keep the code even DRY-er:)

const employeesInitial = [
  {
    employeeId: 123,
    employeeName: 'p'
  },
  {
    employeeId: 456,
    employeeName: 'q'
  },
  {
    employeeId: 789,
    employeeName: 'r'
  }
];
const employees = employeesInitial.map((obj) => ({ ...obj, ...commonParams }));
– CertainPerformance
  • Beware of using spread with an array `.reduce()`; I suspect it leads to O(n^2) behavior or worse. With an array of size 2000, the following code takes over 7 secs on my machine: `let phoneBook = inputs.reduce((acc, entry) => { let [name, phone] = entry.trim().split(' '); return {...acc, [name]: phone}; }, {});` – hope Dec 31 '19 at 03:48
  • Whereas using the following takes 0.07s (a 100-fold difference): `let phoneBook = inputs.reduce((acc, entry) => { let [name, phone] = entry.trim().split(' '); acc[name] = phone; return acc; }, {});` – hope Dec 31 '19 at 03:55
  • Yep, your first code is O(n^2): a loop inside a loop can cause problems. – CertainPerformance Dec 31 '19 at 04:02
  • Just a note: I believe `...obj, ...commonParams` means that in a property collision, the common one would win, whereas the OP puts `...commonParams` first, so any specific settings would override it. – Matt Mc Sep 21 '20 at 00:02
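
(To illustrate the last comment: with object spread, whichever spread comes later wins when keys collide. A minimal sketch with made-up objects:)

const defaults = { role: 'staff', country: 'IN' };  // hypothetical common params
const specific = { role: 'admin' };                 // hypothetical per-item params

console.log({ ...defaults, ...specific }); // { role: 'admin', country: 'IN' } - later spread wins
console.log({ ...specific, ...defaults }); // { role: 'staff', country: 'IN' } - defaults win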

The cost of spreading is significant. We're talking 2 orders of magnitude here.

const { x, y } = z

z = { x, y: y + 1 } // faster
z = { ...z, y: y + 1 } // slower

While they both accomplish similar things, they are very different in their performance characteristics. But it will depend on whether and how your JavaScript is transpiled.

For example, Babel will actually emit something similar to the faster variant if you target ES2015, but if you target ES2017 you'll get the slower variant as-is. If you target ECMASCRIPT_2018 with the Google Closure Compiler, you get the slower variant. With the TypeScript compiler, you end up with twice as many objects because it does nested Object.assign calls.
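
For a rough idea, a downlevel transpiler rewrites the slower variant into something along these lines (a simplified sketch; the real output calls a generated helper such as Babel's _extends or TypeScript's __assign rather than Object.assign directly):

// What `z = { ...z, y: y + 1 }` roughly becomes after downlevel transpilation
// (simplified sketch, not the exact emitted helper code):
z = Object.assign({}, z, { y: y + 1 });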

While spreading is slower, you're still getting a lot of ops per second. It's just that if you do it the boring way, you'll get a lot more ops per second.

I put together a jsperf example to illustrate this.

https://jsperf.com/the-cost-of-spreading/1

If you have a hot code path that does spreading, consider direct construction. Otherwise, don't bother.

– John Leidegren
  • I like the mention of "hot code path" in this answer, as opposed to the accepted answer's twist on extremely large objects. It is quite uncommon to have one extremely large object as opposed to many smaller ones in an array that all require defaults of some kind. +1 for that! – SidOfc Dec 03 '19 at 16:08
  • https://jsperf.com/the-cost-of-spreading/1 -> `This Deployment has been disabled.` – tom10271 Jan 30 '23 at 11:51
  • @tom10271 Unfortunate. You can recreate this benchmark quite easily. Run it 100,000 times and you'll note that there's a big difference in performance. The reason is that the JavaScript engine has to create an iterator when using spread syntax; there's no way around that, and that is why there's a difference. Direct construction lets (at least V8) take a shortcut. It's the same with arrays: direct construction is faster because it won't over-allocate. Minor stuff, but good to know if you're chasing a performance target. – John Leidegren Jan 31 '23 at 18:34
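
(Since the jsperf link is dead, a minimal sketch of recreating the comparison locally might look like the following; the loop count and object shape are arbitrary:)

// Rough local recreation of the benchmark idea (numbers are arbitrary).
let z = { x: 1, y: 2 };

console.time('direct construction');
for (let i = 0; i < 1000000; i++) {
  z = { x: z.x, y: z.y + 1 }; // direct construction
}
console.timeEnd('direct construction');

z = { x: 1, y: 2 };
console.time('spread');
for (let i = 0; i < 1000000; i++) {
  z = { ...z, y: z.y + 1 }; // spread
}
console.timeEnd('spread');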

The time to run the second approach will be longer (even if only slightly on modern computers), as the interpreter has to iterate over the keys of commonParams and copy them into each object.

I wrote a benchmark to measure the difference, which is almost zero for small objects.

function runFirstApproach(){
  const employees1 = [
    {
      company: 'ABC',
      country: 'IN',
      zip: 123,
      employeeId: 123,
      employeeName: 'p'
    },
    {
      company: 'ABC',
      country: 'IN',
      zip: 123,
      employeeId: 456,
      employeeName: 'q'
    },
    {
      company: 'ABC',
      country: 'IN',
      zip: 123,
      employeeId: 789,
      employeeName: 'r'
    }
  ];
}

function runSecondApproach() {
  const commonParams = {
    company: 'ABC',
    country: 'IN',
    zip: 123
  };

  const employees2 = [
    {
      ...commonParams,
      employeeId: 123,
      employeeName: 'p'
    },
    {
      ...commonParams,
      employeeId: 456,
      employeeName: 'q'
    },
    {
      ...commonParams,
      employeeId: 789,
      employeeName: 'r'
    }
  ]
}

function runBenchmarkWithFirstApproach(){
  console.log("Avg time to run first approach -> ", getAvgRunTime(runFirstApproach, 100000))
}

function runBenchmarkWithSecondApproach(){
  console.log("Avg time to run second approach ->", getAvgRunTime(runSecondApproach, 100000))
}

function getAvgRunTime(func, rep){
  let totalTime = 0;
  let tempRep = rep;
  while(tempRep--) {
    const startTime = Date.now();
    func();
    const endTime = Date.now();
    const timeTaken = endTime-startTime;
    totalTime += timeTaken;
  }
  return totalTime/rep;
}

runBenchmarkWithFirstApproach();
runBenchmarkWithSecondApproach();
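
One caveat with the timer above: Date.now() has millisecond resolution, so timing each call individually mostly records zeros for functions this fast. A variant that times the whole batch once (a sketch; getAvgRunTimeBatch is a hypothetical name, not part of the original benchmark) gives a more meaningful average:

// Times the whole batch once instead of per call, avoiding the
// millisecond-resolution floor of Date.now() on individual calls.
function getAvgRunTimeBatch(func, rep) {
  const startTime = Date.now();
  for (let i = 0; i < rep; i++) {
    func();
  }
  return (Date.now() - startTime) / rep;
}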
– Anurag Awasthi

In case someone is stumbling upon this question while wondering about array spreads instead of object spreads:

I benchmarked the different methods of accomplishing:

const clone = [...original]

var original = [];
var clone = [];

for (var i = 0; i < 10000000; i++) {
    original.push(1);
}
var cycle = 0;

var spreadTime = [];
var mapTime = [];
var forTime = [];
var reduceTime = [];
var sliceTime = [];
var arrayFromTime = [];

while (cycle < 10) {
  var d = Date.now();
  clone = [];
  clone = [...original];
  spreadTime.push(Date.now() - d);

  d = Date.now();
  clone = [];
  clone = original.map((entry) => entry);
  mapTime.push(Date.now() - d);

  d = Date.now();
  clone = [];
  for (var i = 0; i < original.length; i++) {
      clone[i] = original[i];
  }
  forTime.push(Date.now() - d);

  d = Date.now();
  clone = [];
  clone = original.reduce((next, e) => {
      next.push(e);

      return next;
  }, []);
  reduceTime.push(Date.now() - d);
  
  d = Date.now();
  clone = [];
  clone = original.slice();
  sliceTime.push(Date.now() - d);

  d = Date.now();
  clone = [];
  clone = Array.from(original);
  arrayFromTime.push(Date.now() - d);

  cycle ++;
  document.getElementById("cycle").innerHTML = cycle;
  document.getElementById("spreadTime").innerHTML = spreadTime.reduce((a,b) => a + b, 0) / spreadTime.length;
  document.getElementById("mapTime").innerHTML = mapTime.reduce((a,b) => a + b, 0) / mapTime.length;
  document.getElementById("forTime").innerHTML = forTime.reduce((a,b) => a + b, 0) / forTime.length;
  document.getElementById("reduceTime").innerHTML = reduceTime.reduce((a,b) => a + b, 0) / reduceTime.length;
  document.getElementById("sliceTime").innerHTML = sliceTime.reduce((a,b) => a + b, 0) / sliceTime.length;
  document.getElementById("arrayFromTime").innerHTML = arrayFromTime.reduce((a,b) => a + b, 0) / arrayFromTime.length;
}
<View>
  <h1>cycle <span id="cycle"></span></h1>
  spread: <span id="spreadTime"></span> ms
  <br/>
  map: <span id="mapTime"></span> ms
  <br/>
  for: <span id="forTime"></span> ms
  <br/>
  reduce: <span id="reduceTime"></span> ms
  <br/>
  slice: <span id="sliceTime"></span> ms
  <br/>
  arrayFrom: <span id="arrayFromTime"></span> ms
  <br/>
</View>
– Nestoro

Just wanted to add ForOf, ForIn, and reduceWithSpread to Nestoro's great benchmark. I added reduceWithSpread since I see it far too often; because of it, I had to lower the array size from 10M to 100K, otherwise the page got stuck...

var original = [];
var clone = [];

for (var i = 0; i < 100000; i++) {
    original.push(1);
}
var cycle = 0;

var spreadTime = [];
var mapTime = [];
var forTime = [];
var forOfTime = [];
var forInTime = [];
var reduceTime = [];
var reduceSpreadTime = [];
var sliceTime = [];
var arrayFromTime = [];

while (cycle < 10) {
  var d = Date.now();
  clone = [];
  clone = [...original];
  spreadTime.push(Date.now() - d);

  d = Date.now();
  clone = [];
  clone = original.map((entry) => entry);
  mapTime.push(Date.now() - d);

  d = Date.now();
  clone = [];
  for (var i = 0; i < original.length; i++) {
      clone[i] = original[i];
  }
  forTime.push(Date.now() - d);
  
  d = Date.now();
  clone = [];
  for (var i of original) {
      clone.push(i);
  }
  forOfTime.push(Date.now() - d);
  
  d = Date.now();
  clone = [];
  for (var i in Object.keys(original)) {
      clone.push(original[i]);
  }
  forInTime.push(Date.now() - d);

  d = Date.now();
  clone = [];
  clone = original.reduce((next, e) => {
      next.push(e);

      return next;
  }, []);
  reduceTime.push(Date.now() - d);
  
  d = Date.now();
  clone = [];
  clone = original.reduce((next, e) => {
      return [...next, e];
  }, []);
  reduceSpreadTime.push(Date.now() - d);
  
  d = Date.now();
  clone = [];
  clone = original.slice();
  sliceTime.push(Date.now() - d);

  d = Date.now();
  clone = [];
  clone = Array.from(original);
  arrayFromTime.push(Date.now() - d);

  cycle ++;
  document.getElementById("cycle").innerHTML = cycle;
  document.getElementById("spreadTime").innerHTML = spreadTime.reduce((a,b) => a + b, 0) / spreadTime.length;
  document.getElementById("mapTime").innerHTML = mapTime.reduce((a,b) => a + b, 0) / mapTime.length;
  document.getElementById("forTime").innerHTML = forTime.reduce((a,b) => a + b, 0) / forTime.length;
  document.getElementById("forOfTime").innerHTML = forOfTime.reduce((a,b) => a + b, 0) / forOfTime.length;
  document.getElementById("forInTime").innerHTML = forInTime.reduce((a,b) => a + b, 0) / forInTime.length;
  document.getElementById("reduceTime").innerHTML = reduceTime.reduce((a,b) => a + b, 0) / reduceTime.length;
  document.getElementById("reduceSpreadTime").innerHTML = reduceSpreadTime.reduce((a,b) => a + b, 0) / reduceSpreadTime.length;
  document.getElementById("sliceTime").innerHTML = sliceTime.reduce((a,b) => a + b, 0) / sliceTime.length;
  document.getElementById("arrayFromTime").innerHTML = arrayFromTime.reduce((a,b) => a + b, 0) / arrayFromTime.length;
}
<View>
  <h1>cycle <span id="cycle"></span></h1>
  spread: <span id="spreadTime"></span> ms
  <br/>
  map: <span id="mapTime"></span> ms
  <br/>
  for: <span id="forTime"></span> ms
  <br/>
  forOf: <span id="forOfTime"></span> ms
  <br/>
  forIn: <span id="forInTime"></span> ms
  <br/>
  reduce: <span id="reduceTime"></span> ms
  <br/>
  reduce with spread: <span id="reduceSpreadTime"></span> ms
  <br/>
  slice: <span id="sliceTime"></span> ms
  <br/>
  arrayFrom: <span id="arrayFromTime"></span> ms
  <br/>
</View>
– Kfir Erez
  • It stuck for me anyway. Reducing to 10k shows that `reduceWithSpread` is around 600x slower (67.5ms) compared to plain `spread`, which clocks in at <0.1ms. – redOctober13 Aug 29 '23 at 19:55