
Stack-safe recursion sometimes produces deferred, nested function call trees which, once evaluated, may exhaust the stack:

const compn = fs => // non-recursive to keep it simple
  fs.reduce((f, g) => comp(f) (g), x => x);

const compn_ = fs => x => // non-recursive to keep it simple
  fs.reduce((x, f) => f(x), x);

const comp = f => g => x => f(g(x));

const inc = x => x + 1;

const fs = Array(1e5).fill(inc);

try {compn(fs) (0)}
catch (e) {
  console.log(e.message);
}

console.log(compn_(fs) (0));

We can avoid building such structures altogether (compn_), yet I wonder if there is a less limited way to deal with them:

const rec = f => (...args) => {
  let step = f(...args);
  const stack = [];

  // first pass: collect the deferred functions until the base case is reached
  while (step.tag !== Base) {
    stack.push(step.f);
    step = f(...step.step.args);
  }

  return x => {
    // second pass: apply the collected functions in reverse order
    for (let i = stack.length - 1; i >= 0; i--) {
      x = stack[i] (x);

      if (x && x.tag === Base) {
        x = x.x;
        break;
      }
    }

    return x;
  }
};

const Base = x =>
  ({tag: Base, x});

const Call = (f, step) =>
  ({tag: Call, f, step});

const Step = (...args) =>
  ({tag: Step, args});

const id = x => x;

const comp = f => g => x => f(g(x));

const inc = x => x + 1;

const fold = f => acc => xs =>
  rec(i =>
    i === xs.length
      ? Base(acc)
      : Call(f(xs[i]) (id), Step(i + 1))) (0);
//                     ^^ works but is nonsensical

const fs = Array(1e5).fill(inc);

console.log(
  fold(f => g => comp(f) (g))
    (id)
      (fs)
        (0)); // 100000

This works, but applying id to each recursive step defeats the purpose of using comp in the first place. Is there a way to work with such deferred tree structures in JavaScript, or do we have to fall back to linear, stack-like structures?
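
To make that concrete, the step that rec pushes at each iteration can be expanded by hand, using only the definitions above:

// f(xs[i]) (id)  with  f = f => g => comp(f) (g)  and  xs[i] = inc
// ≡ comp(inc) (id)
// ≡ x => inc(id(x))
// ≡ inc

const step = comp(inc) (id);

console.log(step(41)); // 42 – extensionally just inc

So rec ends up applying inc sequentially 1e5 times; the composition tree is never actually built.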

  • https://stackoverflow.com/questions/310974/what-is-tail-call-optimization – QuentinUK Mar 13 '20 at 10:35
  • What do you mean by a “less limited way”? In what sense is `compn_` limited? – Aadit M Shah Mar 14 '20 at 03:22
  • Well, `compn_` is eager and it doesn't necessarily produce a recursive data structure; a better wording would have been _`compn_` is less natural for FP_. Anyway, these data structures seem to be the last scenario where I cannot trade stack for heap. Working with such structures seems beneficial in some situations though, so others have solved the problem by [generalizing the trampoline monad](http://blog.higher-order.com/assets/trampolines.pdf), as far as I can tell. Unfortunately I cannot reproduce this solution, because my ability to decipher Scala code is limited (a rough JavaScript sketch of the idea follows this comment thread). –  Mar 14 '20 at 07:23
  • In what sense is the `compn_` function eager? Also, in what sense does the `compn_` function not produce a recursive data structure? As I see it, `∀(fs). compn_(fs) ≡ compn(fs)` modulo stack safety. Hence, if `compn_` is not producing a recursive data structure then neither is `compn`. Finally, even if it is not producing a recursive data structure, how is that “less natural for FP”? – Aadit M Shah Mar 14 '20 at 10:19
  • By the way, in the paper that you linked to the author uses free monads to build an AST of the computation, instead of building the actual computation. By doing so he trades the program stack for the heap. He then creates a stack safe interpreter for the AST. Because the size of the heap is much greater than the size of the program stack, you don't get a stack overflow. In your particular case, `compn` results in a stack overflow because it creates a large computation. However, `compn_` doesn't result in a stack overflow because it trades the stack for the heap by using the list representation. – Aadit M Shah Mar 14 '20 at 10:32
  • `compn` is lazy because `comp` is lazy, provided we use the term laziness in a broader sense and not just for WHNF/thunks. As you know, you can think of `comp` as passing an argument `g` to `f` that is only evaluated when needed. Since `compn` is lazy it creates a deferred function call tree, a description of a computation that is only performed when the missing argument is provided. So the two functions differ both in their temporal behavior and in the intermediate steps they take to produce a result. –  Mar 14 '20 at 11:21
  • No, `comp` is not lazy. It doesn't apply `f` to `g`. It applies `f` to `g(x)`. Since JavaScript is strict, `g(x)` is evaluated before the function `f` has even been called. Hence, `g(x)` is not only evaluated when needed, but also evaluated when not needed. Here's a simple [demo](https://jsfiddle.net/qu3vb8kt/). Hence, `compn` is not creating a deferred function call tree as you claim. Finally, both `compn` and `compn_` return a function. The `.reduce` in `compn_` is inside an arrow function. Hence, in that sense `compn_` is “deferred” too. – Aadit M Shah Mar 14 '20 at 13:52
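
The paper mentioned in the comments sidesteps the problem by building a heap-allocated description of the computation and interpreting it with a stack-safe loop. What follows is only a rough JavaScript sketch of that stack-for-heap idea, not the paper's free monad construction; the Fun and Comp constructors and the interpret function are made up for illustration:

const Fun = f => ({tag: "Fun", f});            // leaf: a plain function
const Comp = (f, g) => ({tag: "Comp", f, g});  // node: f ∘ g, both sub-trees

const interpret = ast => x => {
  const stack = [ast]; // pending nodes live on the heap, not the call stack

  while (stack.length > 0) {
    const node = stack.pop();

    if (node.tag === "Fun") x = node.f(x);

    else { // f ∘ g: evaluate g first, then f
      stack.push(node.f);
      stack.push(node.g);
    }
  }

  return x;
};

const inc = x => x + 1;

const fs = Array(1e5).fill(inc);

// the same shape that compn builds, but as data instead of nested closures
const ast = fs.reduce((acc, f) => Comp(acc, Fun(f)), Fun(x => x));

console.log(interpret(ast) (0)); // 100000

The nesting now lives in plain objects, and interpret keeps its pending work in an array, so the depth of the tree is limited by the heap rather than by the call stack.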

1 Answer


One way to make comp(inc) (comp(inc) (comp(inc) (..))) stack-safe is to defer the execution even further by reifying the continuation:

const Cont = k => ({tag: "Cont", k});

const compk = f => g => x =>
  Cont(k => k(f(g(x))));

const compn = fs =>
  fs.reduce((f, g) => compk(f) (g), x => x);

const id = x => x;

const inc = x => x + 1;

const postRec = cont => {
  do {
    cont = cont.k(id); // force exactly one deferred level per iteration
  } while (cont && cont.tag === "Cont");

  return cont;
};

const fs = Array(1e5).fill(inc);

const main = compn(fs) (0);

console.log(
  postRec(main));

Now the processing of each nested level of the structure is deferred and performed within a trampoline. I call this trampoline postRec, since it runs after the original recursive algorithm has already finished its work. I don't know if this approach is general enough to be useful, though.
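
For what it's worth, the unwinding can be traced by hand for two compositions, using only the definitions above, to see what postRec forces at each iteration:

const acc1 = compk(id) (inc);   // x => Cont(k => k(id(inc(x))))
const acc2 = compk(acc1) (inc); // x => Cont(k => k(acc1(inc(x))))

const step0 = acc2(0);          // a Cont – nothing beneath it is evaluated yet
const step1 = step0.k(id);      // acc1(1), i.e. Cont(k => k(id(inc(1))))
const step2 = step1.k(id);      // 2 – a plain value, so the loop stops here

console.log(step2);             // 2
console.log(postRec(acc2(0)));  // 2 – the same result via the trampoline

Each call to .k(id) descends exactly one level and immediately returns either the next Cont or the final value, which is why the do/while loop never grows the call stack no matter how deeply compn nests the continuations.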