11

When I read the Church Rosser II Theorem here:

Theorem (Church Rosser II)

If there is one terminating reduction, then outermost reduction will terminate, too.

I'm wondering: is there some theorem that strengthens the Church Rosser II Theorem so that it says something about asymptotic time complexity instead of just termination?

Or, can it be proved that the call-by-need strategy has the minimal asymptotic time complexity among all reduction strategies?

luochen1990
  • Why go for asymptotic complexity? Intuitively, call-by-need should have the minimal complexity overall, in absolute numbers. – Joachim Breitner Feb 22 '17 at 15:58
  • I'm pretty sure call-by-need is not optimal under any reasonable cost model, assuming it is understood to mean not reducing inside the body of a lambda (which could get applied many times). It's not so obvious to me exactly how to make a case where it is asymptotically non-optimal, though. – Reid Barton Feb 22 '17 at 16:28
  • I think the tricky part in this proof is "among all reduction strategies". Does global memoization count as an evaluation strategy? If yes, what complexity does it have? – Luka Rahne Feb 22 '17 at 22:41

2 Answers

4

Certainly not. Consider

f x y = sum [1..x] + y
g x = sum (map (f x) [1..x])

Call-by-need reduction of g x will perform O(x^2) additions, but really only O(x) are necessary. For example, lazy HNF will get us this complexity.

-- The definition f3 will get lazily refined.
let f3 y = f 3 y = sum [1,2,3] + y

g 3 = sum (map f3 [1,2,3])
    = sum [f3 1, f3 2, f3 3]

-- Now let's refine f3 to HNF *before* applying it
f3 y = sum [1,2,3] + y
     = 6 + y

-- And continue g 3
g 3 = sum [f3 1, f3 2, f3 3]
    -- no need to compute sum [1..x] three times
    = sum [6 + 1, 6 + 2, 6 + 3]
    = ...

I handwaved the evaluation order quite a bit here, but hopefully you get the idea. We reduce a function body to HNF before applying it, thus sharing any computations that don't depend on the argument.
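
For concreteness, here is a hand-written version of that sharing in plain Haskell (my own sketch; f' and g' are just illustrative names): the argument-independent work sum [1..x] is floated out of the inner lambda, so ordinary call-by-need already evaluates it once per x.

-- Hypothetical hand-optimised variants: the part of f that does not
-- depend on y is computed once and captured by the returned closure.
f' :: Int -> Int -> Int
f' x = let s = sum [1..x] in \y -> s + y

g' :: Int -> Int
g' x = sum (map (f' x) [1..x])

Evaluating g' x now takes only O(x) additions, because map (f' x) [1..x] shares the single closure returned by f' x and, with it, the thunk for sum [1..x]. That is exactly the sharing the HNF strategy above recovers automatically.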

For a whole lot more about this, see Michael Jonathan Thyer's Lazy Specialization.

luqui
1

I think your question is a bit confused. Please let me clarify a few points.

First of all, the theorem you mention is traditionally known as the standardization theorem. It is due to Curry and Feys (Combinatory Logic I, 1958), was generalized to eta reduction by Hindley (Standard and normal reductions), and has since been revised by many other authors (see e.g. this question).

Secondly, outermost reduction is different from call by need (call by need is not even a reduction strategy in the traditional sense of the word).

Coming to the complexity issue, which is the core of the question, call by need is optimal only with respect to weak reduction. However, weak reduction is not always the best way to reduce lambda terms. A well-known example is the following term

                                n 2 I I

where n and 2 are Church numerals, and I is the identity. I add two I's at the end; otherwise weak-reduction languages would stop the computation too early.

Observe that the term reduces to

                          2 (2 (... (2 I))..) I

and (2 I) reduces to I. So, with innermost reduction you would be able to reduce the term in linear time w.r.t. n.

On the other hand, I invite you to perform the previous computation in Haskell, and you will discover that the reduction time grows exponentially in n. I leave it to you to understand the reasons for this phenomenon.
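
If you want to try this yourself, here is one possible self-contained sketch (my own illustration, not part of the argument above; church and term are just illustrative names, and the final 0 is there only so that something can be printed):

import System.Environment (getArgs)

-- A Church numeral built from a machine integer (a rank-1 type suffices).
church :: Int -> (b -> b) -> b -> b
church 0 _ x = x
church k f x = f (church (k - 1) f x)

-- The term  n 2 I I  from above, instantiated at Int and applied to 0.
term :: Int -> Int
term n = church n (\f x -> f (f x)) id id 0

main :: IO ()
main = do
  args <- getArgs
  let n = case args of
            (a:_) -> read a
            _     -> 20
  -- Under GHC's call-by-need evaluation, timing this for increasing n
  -- shows the run time growing exponentially, even though the whole term
  -- reduces to the identity.
  print (term n)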

A similar discussion can be found in this thread.

Andrea Asperti