In the OP's example, the generator will always be slower
While JS engine authors will continue working to improve performance, there are some underlying structural realities that virtually guarantee that, for the OP's test case, building the array will always be faster than using an iterator.
Consider that a generator function returns a generator object. A generator object will, by definition, have a next() method, and calling a function in JavaScript means adding an entry to the call stack. While this is fast, it's likely never going to be as fast as direct property access. (At least, not until the singularity.)
So if you are going to iterate over every single element in a collection, a for loop over a simple array, which reads elements via direct property access, is always going to be faster than a for loop over an iterator, which retrieves each element via a call to next().
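To make that structural difference concrete, here is a rough sketch of my own (not the OP's code; the names numbers, numberGenerator, it and total are purely illustrative) showing what each style of iteration boils down to:
// Array: an indexed for loop reads each element with a direct property access.
const numbers = [10, 20, 30];
let total = 0;
for (let i = 0; i < numbers.length; i++) {
  total += numbers[i];           // no extra function call per element
}

// Generator: every element arrives through a call to next() on the call stack.
function* numberGenerator() {
  yield 10;
  yield 20;
  yield 30;
}
const it = numberGenerator();
let result = it.next();          // one function call...
while (!result.done) {
  total += result.value;
  result = it.next();            // ...per element retrieved
}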
As I'm writing this in January of 2022, running Chrome 97, the generator function is 60% slower than the array function using the OP's example.
Performance is use-case-dependent
It's not difficult to imagine scenarios where the generator would be faster. The major downside to the array function is that it must build the entire collection before the code can start iterating over the elements, whether or not you need all the elements.
Consider a basic search operation which will only access, on average, half the elements of the collection. In this scenario, the array function exposes its "Achilles' heel": it must build an array with all the results, even though half will never be seen. This is where a generator has the potential to far outstrip the array function.
To demonstrate this, I slightly modified the OP's use-case. I made the elements of the array slightly more expensive to calculate (with a little division and square root logic) and modified the loop to terminate at about the halfway mark (to mimic a basic search).
Setup
function getNumberArray(maxValue) {
  const a = [];
  for (let i = 0; i < maxValue; i++) {
    // A little division and square-root work to make each
    // element slightly more expensive to calculate
    const half = i / 2;
    const double = half * 2;
    const root = Math.sqrt(double);
    const square = Math.round(root * root);
    a.push(square);
  }
  return a;
}
function* getNumberGenerator(maxValue) {
  for (let i = 0; i < maxValue; i++) {
    // Same per-element work as the array version
    const half = i / 2;
    const double = half * 2;
    const root = Math.sqrt(double);
    const square = Math.round(root * root);
    yield square;
  }
}
let dummyCalculation;
const numIterations = 99999;
const searchNumber = numIterations / 2;
Generator
const iterator = getNumberGenerator(numIterations);
for (let val of iterator) {
  dummyCalculation = numIterations - val;
  if (val > searchNumber) break; // stop around the halfway mark to mimic a search
}
Array
const iterator = getNumberArray(numIterations);
for (let val of iterator) {
  dummyCalculation = numIterations - val;
  if (val > searchNumber) break;
}
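If you want to run the comparison yourself, a minimal timing harness (my own sketch, reusing the Setup variables above; the console.time labels are arbitrary and this is not necessarily how the OP measured) could look like this:
console.time('array');
for (let val of getNumberArray(numIterations)) {
  dummyCalculation = numIterations - val;
  if (val > searchNumber) break;
}
console.timeEnd('array');

console.time('generator');
for (let val of getNumberGenerator(numIterations)) {
  dummyCalculation = numIterations - val;
  if (val > searchNumber) break;
}
console.timeEnd('generator');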
With this code, the two approaches are neck-and-neck. After repeated test runs, the generator and array functions trade first and second place. It's not difficult to imagine that if the elements of the array were even more expensive to calculate (for example, cloning a complex object, making a REST callout, etc), then the generator would win easily.
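To sketch that last point (my own variation, not the OP's code; structuredClone is available in modern browsers and Node 17+ and stands in here for any expensive per-element work):
function getObjectArray(maxValue, template) {
  const a = [];
  for (let i = 0; i < maxValue; i++) {
    a.push(structuredClone(template)); // pays the cloning cost for every element up front
  }
  return a;
}

function* getObjectGenerator(maxValue, template) {
  for (let i = 0; i < maxValue; i++) {
    yield structuredClone(template);   // only clones when the caller asks for the next element
  }
}

// A search that stops halfway never pays for the second half of the clones.
const template = { label: 'example', values: new Array(100).fill(0) };
let count = 0;
for (const obj of getObjectGenerator(99999, template)) {
  if (++count > 49999) break;
}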
Considerations beyond performance
While recognizing that the OP's question is specifically about performance, I think it's worth calling out that generator functions were not primarily developed as a faster alternative to looping over arrays.
Memory efficiency
The OP has already acknowledged this, but memory efficiency is one of the main benefits generators provide over building arrays. A generator can produce objects on the fly and let them be discarded when they are no longer needed. In the ideal case, a generator need only hold one element in memory at a time, while an array must hold all of them simultaneously.
For a very memory-intensive collection, a generator would allow the system to build objects as they are needed and then reclaim that memory when the calling code moves on to the next element.
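A rough sketch of that difference (my own illustration; the buffer sizes and names are arbitrary):
// Array version: all of the buffers exist in memory at the same time.
function getLargeBuffers(count, size) {
  const buffers = [];
  for (let i = 0; i < count; i++) {
    buffers.push(new Uint8Array(size));
  }
  return buffers; // nothing can be reclaimed until the whole array is released
}

// Generator version: only the buffer currently being processed needs to stay alive.
function* generateLargeBuffers(count, size) {
  for (let i = 0; i < count; i++) {
    yield new Uint8Array(size); // becomes garbage-collectable once the caller moves on
  }
}

for (const buffer of generateLargeBuffers(1000, 1024 * 1024)) {
  // use buffer, then let the reference go out of scope
}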
Representation of non-static collections
Generators don't have to resolve the entire collection, which frees them up to represent collections that might not exist entirely in memory.
For example, a generator can represent collections where fetching the "next" item is time-consuming (such as paging over the results of a database query, where items are fetched in batches), where it is state-dependent (such as iterating over a collection where operations on the current item affect which item is "next"), or where the collection is an infinite series (such as a fractal function, a random number generator, or a generator returning all the digits of π). These are scenarios where building an array up front would be either impractical or impossible.
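As hedged sketches of two of those cases (my own code; fetchPage below is a hypothetical stand-in for whatever batched query API you have, and the infinite counter is purely illustrative):
// Paging: rows are fetched in batches, but handed to the caller one at a time.
// If the caller breaks out early, later pages are never requested.
async function* getAllRows(query, pageSize = 100) {
  let offset = 0;
  while (true) {
    const batch = await fetchPage(query, offset, pageSize); // hypothetical helper
    if (batch.length === 0) return;
    yield* batch;
    offset += batch.length;
  }
}

// Infinite series: a collection no array could ever hold in full.
function* naturalNumbers() {
  let n = 0;
  while (true) {
    yield n++;
  }
}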
One could imagine a generator that returns procedurally generated level data for a game based on a seed number, or one that represents a theoretical AI's "stream of consciousness" (for example, playing a word association game). These are interesting scenarios that would not be possible to represent using a standard array or list, but where a loop structure might feel more natural in code.