
The question is pretty simple: is there any way to call Array methods like filter, find, map, etc. not only on arrays, but on any iterable?

filter, find, map, etc. make sense not only on an array, but generally on any sequence. An iterable is a sequence that can be iterated over, so it makes sense to filter a sequence, find (the first element of) a sequence, or map the elements of a sequence, whatever that sequence is.

Imagine a case like this: an infinite generator (such as the Fibonacci sequence, where the generator returns one item at a time). I want to find the first element that satisfies a given condition. Using spread like this:

[...fib()].find(conditionFunction)

will first dump the whole fib sequence into an array, which crashes the browser due to memory consumption (the sequence is infinite). What I could do is write a for loop manually and apply conditionFunction inside it, for example as sketched below.
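
A minimal sketch of that manual approach (findLazily is just an illustrative name; fib and conditionFunction are as referenced above):

// pull values from the generator one at a time and stop at the first match
function findLazily(iterable, conditionFunction) {
    for (const x of iterable) {
        if (conditionFunction(x))
            return x;       // found it, without materializing the sequence
    }
    return undefined;       // only reachable for finite iterables
}

// findLazily(fib(), conditionFunction)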

Is there any way to call filter, find, map, etc. lazily on (non-array) iterables?

ducin
    Does this answer your question? [Why do generators not support map()?](//stackoverflow.com/q/31232415/90527) – outis May 25 '22 at 21:32

2 Answers


Unfortunately, built-in array methods like find are implemented using the sequence protocol (.length + indexed Get), not the iterator protocol. You can try fooling them with a Proxy that makes an iterable impersonate a sequence, e.g.

// wrap an iterator in a Proxy that pretends to be array-like:
// 'length' reads as a huge number, and every indexed read pulls the next value
let asArray = iterable => new Proxy(iterable, {
    get(target, prop, receiver) {
        if (prop === 'length')
            return Number.MAX_SAFE_INTEGER;
        return target.next().value;
    }
});


function *fib() {
    let [a, b] = [1, 1];

    while (1) {
        yield b;
        [a, b] = [b, a + b];
    }
}

// find the first Fibonacci number greater than 500
let found = [].find.call(
    asArray(fib()),
    x => x > 500);

console.log(found);

Requires some more work, but you get the idea.

Another (and IMO much cleaner) way would be to reimplement the iterator methods to support iterables (and to be generators themselves). Luckily, this is pretty trivial:

function *lazyMap(iter, fn) {
    for (let x of iter)
        yield fn(x);
}


for (let x of lazyMap(fib(), x => x + ' hey'))...
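
For instance, a complete loop might look like this (a sketch; it has to break manually because fib is infinite):

let count = 0;
for (let x of lazyMap(fib(), x => x + ' hey')) {
    console.log(x);             // "1 hey", "2 hey", "3 hey", "5 hey", ...
    if (++count >= 5) break;    // stop by hand, the source never ends
}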

And here's how one can make a lazy iterator object with chainable methods:

// factory: wrap any iterable in a chainable _iter object
let iter = function (it) {
    return new _iter(it);
};

let _iter = function (it) {
    this.it = it;
};

_iter.prototype[Symbol.iterator] = function *() {
    for (let x of this.it) {
        yield x;
    }
};

// lazily map each element through fn, returning a new chainable iter
_iter.prototype.map = function (fn) {
    let _it = this.it;
    return iter((function *() {
        for (let x of _it) {
            yield fn(x);
        }
    })());
};

// lazily take (at most) the first n elements, returning a new chainable iter
_iter.prototype.take = function (n) {
    let _it = this.it;
    return iter((function *() {
        for (let x of _it) {
            yield x;
            if (!--n)
                break;
        }
    })());
};

// @TODO: filter, find, takeWhile, dropWhile etc
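
// one possible sketch of filter and find in the same style (illustrative):

// lazily keep only the elements for which fn returns a truthy value
_iter.prototype.filter = function (fn) {
    let _it = this.it;
    return iter((function *() {
        for (let x of _it) {
            if (fn(x))
                yield x;
        }
    })());
};

// eagerly return the first element for which fn returns a truthy value
_iter.prototype.find = function (fn) {
    for (let x of this.it) {
        if (fn(x))
            return x;
    }
};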

// example:


// endless fibonacci generator
function *fib() {
    let [a, b] = [1, 1];

    while (1) {
        yield b;
        [a, b] = [b, a + b];
    }
}

// get first 10 fibs, multiplied by 11
let a = iter(fib())
    .map(x => x * 11)
    .take(10);

console.log([...a]);
georg
  • don't get such accurate answers often, thanks!!! PS Do you think it'd make sense to re-implement filter/find/map/etc. onto iterables, in order to create a true pipeline processing, no matter what structure (and how) you iterate over? or is it just me...? – ducin Jun 02 '17 at 18:04
  • @ducin: yes, makes sense definitely, see an update for a possible approach. – georg Jun 02 '17 at 19:15

Using the iter-ops library (I'm the author):

import {pipe, first} from 'iter-ops';

const i = pipe(
    fib(),
    first((value, index) => {
        // return a truthy value here when found what you need
    })
); // IterableExt<number>

console.log('found:', i.first);
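
For the Fibonacci case from the question, the predicate might simply check the value (a sketch, assuming the fib generator shown in the other answer):

import {pipe, first} from 'iter-ops';

const i = pipe(
    fib(),
    first(value => value > 500) // first Fibonacci number greater than 500
);

console.log('found:', i.first); // => 610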
vitaly-t