
I have worked out two ways of calculating the same thing in my JavaScript. I just wonder which is more efficient in terms of memory usage and processing power:

AveragePL = netArray.reduce((sum,arr) => sum + arr, 0);

or

for (let index in netArray) {
  AveragePL = AveragePL + netArray[index];
}

I realise I could do AveragePL += netArray[index], but I am new to JS, so I am using the full form so that I know what is going on for now.

  • If you are new to JS: don't `for (let index in netArray) {AveragePL = AveragePL + netArray[index];}`, do `for (let value of netArray) {AveragePL = AveragePL + value;}`. The first one scans the array for existing indices, *and* then indexes into the array, which is not just more complicated to write, but also slower than picking the values immediately. – tevemadar Dec 30 '22 at 00:04

4 Answers


reduce is more efficient than for loops. You can take a look here: Why you should use reduce instead of loops

Adam

It would actually depend on the implementation of the JS engine.

Additionally there are different types of for loops for..in, for..of and the classic for

The classic for is the most optimized, so in most implementations it will blow all the others out of the water, including the reduce approach, which has to execute a function call on every iteration.

for (let i = 0, len = netArray.length; i < len; i++){
    AveragePL = AveragePL + netArray[i];
}

In Firefox you can see the following extended benchmark.

See https://jsben.ch/xU8Nd

Gabriele Petrioli

Using the version of node available to me, it seems that reduce is quite a bit faster. Times are in milliseconds.

node benchmark.js
reduce 3.5796960592269897
for-loop 16.456849932670593
node -v
v14.19.1

The benchmark script:
const { performance } = require("perf_hooks");

function benchmark(label, fn) {
  const t0 = performance.now();
  fn();
  const t1 = performance.now();
  console.log(label, t1 - t0);
}

let netArray = Array.from({ length: 100000 }, () => Math.random());
benchmark("reduce", function() {
  netArray.reduce((sum, arr) => sum + arr, 0);
});
benchmark("for-loop", function() {
  let AveragePL = 0;
  for (let index in netArray) {
    AveragePL = AveragePL + netArray[index];
  }
});
Jared Beck

The .reduce() is much faster than a loop; prefer reduce over a loop when you can. Here is a test of an array of 1'000 items, repeated 10'000 times:

const netArray = Array.from(Array(1000).keys());

let start = new Date();
let sumReduce;
for(let i = 0; i < 10000; i++) {
  sumReduce = netArray.reduce((sum,arr) => sum + arr, 0);
}
let timeReduce = new Date() - start;

start = new Date();
let sumLoop;
for(let i = 0; i < 10000; i++) {
  sumLoop = 0;
  for(let index in netArray) {
    sumLoop += netArray[index];
  }
}
let timeLoop = new Date() - start;

console.log({
  sumReduce: sumReduce,
  timeReduce: timeReduce,
  sumLoop: sumLoop,
  timeLoop: timeLoop
});

Output:

{
  "sumReduce": 499500,
  "timeReduce": 77,
  "sumLoop": 499500,
  "timeLoop": 971
}

Times are in msec.

Peter Thoeny