I've spent some time recently playing around with transducers (a functional-programming tool meant to improve performance without losing the readability and flexibility of the code), and when I came to testing their actual speed, I got some quite disappointing results. Consider:

const inc = x => x + 1;

const isEven = x => x % 2 === 0;

// simplest, shortest way I would be comfortable with if performance wasn't an issue

const mapFilter = xs => xs.filter(isEven).map(inc);

// transducers way

// function composition
const compose = (...fns) => x => fns.reduceRight((y, f) => f(y), x);

const map = f => step => (a, c) => step(a, f(c));
const filter = p => step => (a, c) => (p(c) ? step(a, c) : a);

// basic reducer for building array
const build = (acc, x) => {
  acc.push(x);
  return acc;
};

// transducer: it doesn't create intermediate arrays, hence should theoretically be faster
const transducers = xs =>
  xs.reduce(compose(filter(isEven), map(inc))(build), []);

// native loop for comparison
const nativeLoop = data => {
  const result = [];
  const l = data.length;
  for (let i = 0; i < l; i++) {
    const x = data[i];
    if (isEven(x)) result.push(inc(x));
  }
  return result;
};

const data = Array(1000).fill(1);

const base = ["simplest, chained map and filter", () => mapFilter(data)];
const alternative = ["composed transducers", () => transducers(data)];
const alternative2 = ["native loop", () => nativeLoop(data)];

/* console.log(Benchmark) */
console.log("Running benchmarks....");

const suite = new Benchmark.Suite();
suite
  .add(...base)
  .add(...alternative)
  .add(...alternative2)
  .on("cycle", function(event) {
  console.log(String(event.target));
})
  .on("complete", function() {
  console.log("Fastest is " + this.filter("fastest").map("name").join(", "));
})
// run async
  .run({ async: true });
<script src="https://cdnjs.cloudflare.com/ajax/libs/lodash.js/4.17.15/lodash.min.js"></script>
<script src="https://cdnjs.cloudflare.com/ajax/libs/benchmark/2.1.4/benchmark.min.js"></script>

I would expect that the order of performance will be

native loop > transducers > chained map/filter

Meanwhile, setting aside the native approach, which is way faster than any other, it turned out, to my great astonishment, that the reduce/transduce way is much slower than using map/filter and creating intermediate arrays (slower by about an order of magnitude in Chrome). Could someone explain to me the origin of this result?

Jason Aller
  • When you are interested in micro-optimization, just stick with native loops. There is nothing wrong with that. –  Feb 18 '20 at 20:32
  • My guess is the built-in `Array#map` and `Array#filter` and so forth are heavily optimized native code, but the transducer suffers from a lot of function call overhead. Gotta look the function up in the symbol table, create a stack frame, push the arguments, jump, return the result... If performance matters, I agree with bob--stick to native and naive code. Even if performance isn't important, the transducer doesn't seem readable to me, although I'm no Schemer. `const mapFilter = xs => xs.filter(isEven).map(inc);` is idiomatic, direct, and non-clever. – ggorlen Feb 18 '20 at 21:03
  • As an aside, replace `build` with `const build = (acc, x) => { return [...acc, x]; };` to make it a pure function. – Jawi Feb 18 '20 at 21:27
  • @MichałKaczanowicz You are right. It's just that I am fed up with all these performance-related questions about FP. But your specific question doesn't fall into this category. My bad. –  Mar 11 '20 at 10:33
  • I think you will enjoy [_How to chain, map, and filter functions in the correct order?_](https://stackoverflow.com/a/44211131/633183) – Mulan Nov 02 '20 at 13:04
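To make the call-overhead guess above concrete, here is a hypothetical bit of instrumentation (the `traced` wrapper is illustrative, not part of the original post) that tallies how many extra closure invocations the transducer pipeline adds per run:

```javascript
// Hypothetical instrumentation: count closure-layer invocations in the
// transducer chain. These calls come on top of the calls to isEven/inc
// that the chained native filter/map would also make.
let layerCalls = 0;
const traced = f => (...args) => { layerCalls++; return f(...args); };

const inc = x => x + 1;
const isEven = x => x % 2 === 0;
const compose = (...fns) => x => fns.reduceRight((y, f) => f(y), x);
const map = f => step => traced((a, c) => step(a, f(c)));
const filter = p => step => traced((a, c) => (p(c) ? step(a, c) : a));
const build = traced((acc, x) => { acc.push(x); return acc; });

const data = [1, 2, 3, 4];
data.reduce(compose(filter(isEven), map(inc))(build), []);

// 4 hits on the filter layer + 2 on the map layer + 2 on build:
console.log(layerCalls); // 8
```

So even on four elements the transducer routes every value through a stack of hand-written closures, while `filter`/`map` keep their loops inside optimized native code.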

2 Answers

Your benchmark is wrong because you're building a new transducer chain on each run.

const inc = x => x + 1;

const isEven = x => x % 2 === 0;

// simplest, shortest way I would be comfortable with if performance wasn't an issue

const mapFilter = xs => xs.filter(isEven).map(inc);

// transducers way

// function composition
const compose = (...fns) => x => fns.reduceRight((y, f) => f(y), x);

const map = f => step => (a, c) => step(a, f(c));
const filter = p => step => (a, c) => (p(c) ? step(a, c) : a);

// basic reducer for building array
const build = (acc, x) => {
  acc.push(x);
  return acc;
};

// transducer: it doesn't create intermediate arrays, hence should theoretically be faster
const reducer = compose(filter(isEven), map(inc))(build);
const transducers = xs => xs.reduce(reducer, []);

// native loop for comparison
const nativeLoop = data => {
  const result = [];
  const l = data.length;
  for (let i = 0; i < l; i++) {
    const x = data[i];
    if (isEven(x)) result.push(inc(x));
  }
  return result;
};

const data = Array(1000).fill(1);

const base = ["simplest, chained map and filter", () => mapFilter(data)];
const alternative = ["composed transducers", () => transducers(data)];
const alternative2 = ["native loop", () => nativeLoop(data)];

/* console.log(Benchmark) */
console.log("Running benchmarks....");

const suite = new Benchmark.Suite();
suite
  .add(...base)
  .add(...alternative)
  .add(...alternative2)
  .on("cycle", function(event) {
  console.log(String(event.target));
})
  .on("complete", function() {
  console.log("Fastest is " + this.filter("fastest").map("name").join(", "));
})
// run async
  .run({ async: true });
<script src="https://cdnjs.cloudflare.com/ajax/libs/lodash.js/4.17.15/lodash.min.js"></script>
<script src="https://cdnjs.cloudflare.com/ajax/libs/benchmark/2.1.4/benchmark.min.js"></script>

As you can see, transducers are indeed faster than chained map and filter methods.
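For reference, the hoisting this answer describes can be sketched without Benchmark.js (a minimal standalone version of the same fix): the composed reducer is built once at module load, so `compose`, `filter`, and `map` no longer run on every benchmark iteration.

```javascript
const inc = x => x + 1;
const isEven = x => x % 2 === 0;
const compose = (...fns) => x => fns.reduceRight((y, f) => f(y), x);
const map = f => step => (a, c) => step(a, f(c));
const filter = p => step => (a, c) => (p(c) ? step(a, c) : a);
const build = (acc, x) => { acc.push(x); return acc; };

// Built once, outside the hot path:
const reducer = compose(filter(isEven), map(inc))(build);
const transduce = xs => xs.reduce(reducer, []);

console.log(transduce([1, 2, 3, 4])); // [ 3, 5 ]
```

In the original benchmark the equivalent of `reducer` was reconstructed inside the timed function on every call, so the measurement included the cost of composing the transducer, not just running it.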

Aadit M Shah
The benchmark is flawed: the reducer doesn't have to do any work.

  • It creates a uniform array filled with the odd number 1.
  • It then runs the `isEven` predicate over each element.
  • Every pipeline is therefore always going to return an empty array.

We are benchmarking the performance of returning an empty array.
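A one-line check confirms this: with the question's input every element is odd, so nothing survives the filter in any of the three pipelines.

```javascript
const isEven = x => x % 2 === 0;
const data = Array(1000).fill(1); // every element is odd
console.log(data.filter(isEven).length); // 0
```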

If we prefill an array with real data, the native method will win. Aadit is correct, though: his transducer is the faster of the two transducer implementations.

const data = [];

for (let i = 0; i < 1000; i++) {
    data.push(Math.floor(Math.random() * 10));
}
James Laux