
I have a generator called generateNumbers in JavaScript and another generator generateLargerNumbers which takes each value generated by generateNumbers and applies a function addOne to it, as such:

function addOne(value) {
  return value + 1
}

function* generateNumbers() {
  yield 1
  yield 2
  yield 3
}

function* generateLargerNumbers() {
  for (const number of generateNumbers()) {
    yield addOne(number)
  }
}

Is there any terser way to do this without building an array out of the generated values? I'm thinking something like:

function* generateLargerNumbers() {
  yield* generateNumbers().map(addOne) // obviously doesn't work
}
4castle
damd
  • please add an example. – Nina Scholz Aug 31 '17 at 14:28
  • Sure, you can do just that. Either [with a `.map` helper method](https://stackoverflow.com/a/31232620/1048572) or as [a standalone function](https://stackoverflow.com/a/45735702/1048572). – Bergi Aug 31 '17 at 14:30
  • @NinaScholz: Fixed the question up with a better example – damd Aug 31 '17 at 14:31
  • Does this answer your question? "[Why do generators not support map()?](//stackoverflow.com/q/31232415/90527)", "[calling filter, find (array methods) lazily on all iterables](//stackoverflow.com/q/44326481/90527)" – outis May 25 '22 at 21:49

4 Answers


higher-order generators

You can choose to manipulate the generator functions themselves

const Generator =
  {
    map: (f,g) => function* (...args)
      {
        for (const x of g (...args))
          yield f (x)
      },
    filter: (f,g) => function* (...args)
      {
        for (const x of g (...args))
          if (f (x))
            yield x
      }
  }

// some functions !
const square = x =>
  x * x

const isEven = x =>
  (x & 1) === 0
  
// a generator !
const range = function* (x = 0, y = 1)
  {
    while (x < y)
      yield x++
  }

// higher order generator !
for (const x of Generator.map (square, Generator.filter (isEven, range)) (0,10))
  console.log('evens squared', x)

higher-order iterators

Or you can choose to manipulate iterators

const Iterator =
  {
    map: (f, it) => function* ()
      {
        for (const x of it)
          yield f (x)
      } (),
    filter: (f, it) => function* ()
      {
        for (const x of it)
          if (f (x))
            yield x
      } ()
  }

// some functions !
const square = x =>
  x * x
  
const isEven = x =>
  (x & 1) === 0

// a generator !
const range = function* (x = 0, y = 1)
  {
    while (x < y)
      yield x++
  }
  
// higher-order iterators !
for (const x of Iterator.map (square, Iterator.filter (isEven, range (0, 10))))
  console.log('evens squared', x)

recommendation

In most cases, I think it's more practical to manipulate the iterator because of its well-defined (albeit kludgy) interface. It allows you to do something like

Iterator.map (square, Iterator.filter (isEven, [10,11,12,13]))

Whereas the other approach is

Generator.map (square, Generator.filter (isEven, Array.from)) ([10,11,12,13])

Both have a use-case, but I find the former much nicer than the latter


persistent iterators

JavaScript's stateful iterators annoy me – each subsequent call to .next alters the internal state irreversibly.
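
For instance, a quick sketch of the complaint – once a built-in iterator has been consumed, there's no way to rewind it:

const nums = function* () { yield 1; yield 2; yield 3 } ()

console.log([ ...nums ]) // [ 1, 2, 3 ]
console.log([ ...nums ]) // [] – nums is already spent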

But! there's nothing stopping you from making your own iterators and then creating an adapter to plug them into JavaScript's stack-safe generator mechanism

If this interests you, you might like some of the other accompanying examples found here: Loop to a filesystem structure in my object to get all the files

The gain isn't only that we can reuse a persistent iterator; with this implementation, subsequent reads are even faster than the first because of memoisation – score: JavaScript 0, Persistent Iterators 2

// -------------------------------------------------------------------
const Memo = (f, memo) => () =>
  memo === undefined
    ? (memo = f (), memo)
    : memo

// -------------------------------------------------------------------
const Yield = (value, next = Return) =>
  ({ done: false, value, next: Memo (next) })
  
const Return = value =>
  ({ done: true, value })
  
// -------------------------------------------------------------------
const MappedIterator = (f, it = Return ()) =>
  it.done
    ? Return ()
    : Yield (f (it.value), () => MappedIterator (f, it.next ()))
    
const FilteredIterator = (f, it = Return ()) =>
  it.done
    ? Return ()
    : f (it.value)
      ? Yield (it.value, () => FilteredIterator (f, it.next ()))
      : FilteredIterator (f, it.next ())

// -------------------------------------------------------------------
const Generator = function* (it = Return ())
  {
    while (it.done === false)
      (yield it.value, it = it.next ())
    return it.value
  }

// -------------------------------------------------------------------  
const Range = (x = 0, y = 1) =>
  x < y
    ? Yield (x, () => Range (x + 1, y))
    : Return ()

const square = x =>
  x * x

const isEven = x =>
  (x & 1) === 0

// -------------------------------------------------------------------
for (const x of Generator (MappedIterator (square, FilteredIterator (isEven, Range (0,10)))))
  console.log ('evens squared', x)
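
To see the reuse in action, here's a small sketch using only the definitions above – bind the iterator to a variable and run it through Generator twice; both passes yield the same sequence, and the second one reads the memoised nodes:

const it = MappedIterator (square, FilteredIterator (isEven, Range (0, 10)))

console.log (Array.from (Generator (it))) // [ 0, 4, 16, 36, 64 ]
console.log (Array.from (Generator (it))) // [ 0, 4, 16, 36, 64 ] – replayed from the memoised thunks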
Mulan
  • Both your "higher order" iterator and your persistent iterator (very nice btw) indicate the shortcomings of the imperative iterator concept. An iterator isn't a function, hence you have to use a function wrapper to make it composable. But an iterator isn't a data type either, therefore you have to memoize the getter computation in order to reuse it. As a result an iterator is a stateful getter object, tightly coupled to its data source. Or as I'd express it: It is the worst conceivable starting point for an iterative algorithm. Transducers/short circuiting are the better choice. –  Sep 01 '17 at 15:23
  • @ftor an iterator *could* be a function; it's all dependent on how it's implemented; function composition is not the only means of combination – also, the memoisation is an optimisation, not a requirement; in javascript, only a thunk is required to delay the evaluation – iterators/streams are not interchangeable with transducers in all cases; eg, an transducer couldn't be used on an infinite collection like `listOfPrimeNumbers` – Mulan Sep 04 '17 at 18:46
  • ... see [this recent answer](https://stackoverflow.com/a/45951461/633183) - it's written in scheme but shows the same program written in two different ways: once with streams (iterators), and again with transducers - the *only* difference here is that scheme allows you to create a special form for the stream constructor where the second argument is automatically wrapped in a thunk; in js, we must explicitly wrap in a thunk - note how the contents of the list needs to be known before running the transducer program, but only the starting number (`2`) is needed to begin the streams program – Mulan Sep 04 '17 at 18:51
  • ... [Loop to a filesystem structure in my object to get all the files](https://stackoverflow.com/questions/45565900/loop-to-a-filesystem-structure-in-my-object-to-get-all-the-files/45585629#45585629) shows that iterators *can* be combined in meaningful ways – of particular importance, the `ConcatIterator` concats an iterator with itself, demonstrating that the same iterator can be consumed multiple times – `MyTreeIterator` showcases a natural, recursive traversal that forks into several iterators, then flattens back down – Mulan Sep 04 '17 at 18:58
  • I think these examples show that there are trade-offs between iterators and transducers, but they are not mutually exclusive; both have compelling use cases and both can be used to express pure functional programs – as always, thanks for the discussion ^_^ – Mulan Sep 04 '17 at 18:59
  • scheme 101 for js users: `(define (add x y) (+ x y))` and `(add 2 3)` are equivalent to javascript's `const add = (x,y) => x + y` and `add (2,3)` respectively – Mulan Sep 04 '17 at 19:04
  • I think an async iterator would be the appropriate implementation of a (data) stream. But you're right, laziness is the key concept to understand iterators and strict evaluation is the reason why people believe in needing them. What does laziness mean in the context of strict eval? First of all, thunks, right! But as soon as iterative algorithms are involved, short circuiting matters too. We can simulate SC by inverting the control over the iteration from the producer to the consumer. –  Sep 05 '17 at 07:19
  • You can certainly remember `foldrk`. A transducer based on `foldrk` can consume infinite structures, provided that there is a suitable consumer (e.g. `take`). There might be use cases where transducers fail but iterators don't, though. Iterators are imperative and lead to imperative code. The main use case for iterators is when you don't care about the end result of an iteration and use a `for/of` statement. Sure, you can transform iterators into combinators or persistant data structures. But by doing so, you deny the original intention of the designer - at least in Javascript. –  Sep 05 '17 at 07:22
  • @ftor, indeed you can fold an infinite data structure using `foldrk`; it makes sense that you would be able to express the computation from either side; still, I don't want you to think that iterators/streams are inherently imperative or somehow lead to imperative code; that's just not the case – please review [this Stream implementation](https://repl.it/Kja1) to see if it affects your perspective – Mulan Sep 06 '17 at 04:11
  • fwiw, this is no more *pure* than the other implementations provided; implementation details like state or coding style (imperative vs functional) are black box and not users' concern – the user is able to express the program with streams utilized in a purely functional way – oh, i did find a clever way to do away with the thunks using this interface, tho – Mulan Sep 06 '17 at 04:12
  • I really messed it up in my comment. You cannot fold right an infinite collection. Thank you for not spitting it in my face! Eventually, we have to deal with black boxes, yes. When it comes to IO, for instance. Maybe it is the same with iterators, because they can express stuff in a lazy way that transducers can't in a strictly evaluated environment. I will keep that in mind. –  Sep 06 '17 at 07:46
  • @ftor, admittedly, I don't know much about folding infinite structures, but I did [find this](https://stackoverflow.com/questions/7396978/left-and-right-folding-over-an-infinite-list) – as they are written in Haskell's Prelude, it seems like `foldr` is the only working solution due to Haskell's lazy evaluation – this could be ported to js with trivial effort, but it does make me wonder if you could implement `foldl` in a way that makes it suitable for infinite structures ! – Mulan Sep 07 '17 at 19:32

There isn't a built-in way to map over Generator objects, but you could roll your own function:

const Generator = Object.getPrototypeOf(function* () {});
Generator.prototype.map = function* (mapper, thisArg) {
  for (const val of this) {
    yield mapper.call(thisArg, val);
  }
};

Now you can do:

function generateLargerNumbers() {
  return generateNumbers().map(addOne);
}

console.log(...generateLargerNumbers()) // 2 3 4
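
As a side note: runtimes that ship the Iterator Helpers proposal (standardised in ES2025; e.g. recent V8-based browsers and Node.js 22+) give generator objects a native map, so the exact code from the question works there with no patching at all:

function* generateNumbers() {
  yield 1
  yield 2
  yield 3
}

console.log(...generateNumbers().map(x => x + 1)) // 2 3 4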
idmean
4castle

How about composing an iterator object, instead of using the nested generators?

function* generateNumbers(){
  yield 1;
  yield 2;
  yield 3;
}

function generateGreaterNumbers(){
  return {
    next(){
      var r = this.gen.next();
      r.value += 1;
      return r;
    },
    gen: generateNumbers()
  };
}
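
Note that, as written, the returned object has no Symbol.iterator, so it can't be driven by for...of – it has to be consumed with manual next() calls, and (as Bergi points out below) the final result's value comes back as NaN:

const it = generateGreaterNumbers()

console.log(it.next()) // { value: 2, done: false }
console.log(it.next()) // { value: 3, done: false }
console.log(it.next()) // { value: 4, done: false }
console.log(it.next()) // { value: NaN, done: true }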
John Smith
  • Interesting solution, but not necessarily more terse or readable than the alternative :) – damd Aug 31 '17 at 14:56
  • This converts the return value from `undefined` to `NaN`, which might not be wanted – Bergi Aug 31 '17 at 15:21

If you need to actually pass values to your generator, you can't do it with for...of – you have to pass each value through yourself:

const mapGenerator = (generatorFunc, mapper) =>
  function*(...args) {
    let gen = generatorFunc(...args),
      i = 0,
      value;
    while (true) {
      const it = gen.next(value);
      if (it.done) return mapper(it.value, i);
      value = yield mapper(it.value, i);
      i++;
    }
  };

function* generator() {
  console.log('generator received', yield 1);
  console.log('generator received', yield 2);
  console.log('generator received', yield 3);
  return 4;
}


const otherGenerator = mapGenerator(generator, x => x + 1)

const it = otherGenerator();
console.log(
  it.next().value,
  it.next('a').value, 
  it.next('b').value,
  it.next('c').value
);
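
For reference, the snippet above should log the generator's own output first (it runs while the arguments to the outer console.log are being evaluated), then the mapped values:

generator received a
generator received b
generator received c
2 3 4 5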
Jake Coxon