
Short and sweet: I've seen several sources talking about "supercompilation". But I have yet to find one single document anywhere on the face of the Internet which describes what this is. Presumably because it seems simple enough to whoever that it isn't even worth explaining.

Does anybody know what this actually is?

MathematicalOrchid

  • http://c2.com/cgi/wiki?SuperCompiler – vulkanino Jan 30 '12 at 16:54
  • See the presentations on Neil Mitchell's Supero page; in a nutshell, the program is evaluated at compile time. Where the program has no runtime data dependencies it can be fully evaluated; otherwise it leaves a "residual" expression which forms part of the executable. – stephen tetley Jan 30 '12 at 19:02

2 Answers


Supercompilation can be approached as a generalization of partial evaluation. The idea behind partial evaluation is that many parts of a program can be evaluated at compile time, and so should be. Supercompilation extends this, evaluating things that can't be fully done at compile time as well, like turning `map f (map g xs)` into `map (f . g) xs` using nothing besides the definition of `map`. (At least I think I got partial evaluation right; I've only read much about supercompilation.)
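To make the fusion concrete, here is a small sketch of my own (the function names `twoPasses` and `onePass` are mine, not from any of the papers mentioned here). A supercompiler, given only the definition of `map`, can derive a single-traversal residual program from the two-traversal original:

```haskell
inc, dbl :: Int -> Int
inc = (+ 1)
dbl = (* 2)

-- Original program: the inner map builds an intermediate list
-- that the outer map immediately consumes.
twoPasses :: [Int] -> [Int]
twoPasses xs = map inc (map dbl xs)

-- Roughly what a supercompiler might residualize: one traversal,
-- no intermediate list, equivalent to map (inc . dbl).
onePass :: [Int] -> [Int]
onePass []       = []
onePass (x : xs) = inc (dbl x) : onePass xs

main :: IO ()
main = print (twoPasses [1, 2, 3] == onePass [1, 2, 3])
```

Both versions compute the same result; the point is that the fused form was obtained mechanically, not by a hand-written rewrite rule.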

Another way to look at it is as a combination of many other optimizations, like deforestation, specialization, and inlining. By acting as if it already knew the inputs to functions and evaluating, it can derive a more direct way of computing the result: it can eliminate intermediate data structures by seeing how they will be used, or it can plug in all possible values and then wrap the result in a `case`, or do something else with its pretend values.
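The "plug in all possible values and wrap the result in a case" idea can be sketched like this (a hypothetical example of mine; `runGeneric` and `runSpecialized` are illustrative names, not anything from the literature). Because the flag is a `Bool`, both possible values can be evaluated with, and the dispatch lifted out of the loop:

```haskell
-- A step function whose behavior depends on a Bool flag.
step :: Bool -> Int -> Int
step flag n = if flag then n + 1 else n * 2

-- Generic loop: re-examines the flag on every iteration.
runGeneric :: Bool -> Int -> Int -> Int
runGeneric _    0 acc = acc
runGeneric flag k acc = runGeneric flag (k - 1) (step flag acc)

-- Roughly the residual program: the case is wrapped around two
-- specialized straight-line loops, so the flag is tested once.
runSpecialized :: Bool -> Int -> Int -> Int
runSpecialized flag k acc = case flag of
  True  -> goTrue k acc
  False -> goFalse k acc
  where
    goTrue  0 a = a
    goTrue  n a = goTrue  (n - 1) (a + 1)
    goFalse 0 a = a
    goFalse n a = goFalse (n - 1) (a * 2)

main :: IO ()
main = print ( runGeneric True 5 0 == runSpecialized True 5 0
            && runGeneric False 5 1 == runSpecialized False 5 1 )
```

The two agree on all inputs; the specialized form is what "pretending to know the input" buys you when the input's domain is small.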

Max Bolingbroke has a number of useful papers on the subject - I recommend the first one, Supercompilation by Evaluation, as an introduction. Section 2 introduces the subject by example and the rest, while a bit difficult to get through, is very informative about the process. Neil Mitchell also has a number of good presentations describing it.

I hope that helps.

gereeter

  • I don't get the distinction between supercompilation and partial evaluation from this. "Supercompilation extends this, evaluating things that can't be fully done at compile time as well" – this also describes partial evaluation. Partially evaluating a program in general leaves a partially evaluated residual program. Partial evaluation should ideally make full use of all statically known inputs/arguments in partially evaluating the program. – Guildenstern Oct 31 '14 at 15:42
  • They certainly overlap, and as partial evaluation gets extended, it becomes supercompilation. I think of the difference as: partial evaluation decides in advance which bits to eliminate (e.g. the first argument to map), then goes and eliminates them. Supercompilation bashes everything with the same rules and sees what comes out. – Neil Mitchell Jan 13 '15 at 11:53
  • The link above is broken; found it [here](http://research.microsoft.com/en-us/um/people/simonpj/papers/supercompilation/supercomp-by-eval.pdf) – Chris Hanson Jan 13 '15 at 16:02
  • @ChrisHanson Thanks! The link should be fixed now. – gereeter Jan 19 '15 at 20:50
  • `map f (map g xs)` turns into `map (f . g) xs` when you use church-encoded values... – MaiaVictor Nov 18 '15 at 19:16
  • link to "number of good presentations" should now be https://web.archive.org/web/20190216072136/http://community.haskell.org/~ndm/supero/ –  Apr 25 '22 at 07:25

From Wikipedia on Metacompilation:

Metacompilation is a computation which involves metasystem transitions (MST) from a computing machine M to a metamachine M' which controls, analyzes and imitates the work of M. Semantics-based program transformation, such as partial evaluation and supercompilation (SCP), is metacomputation.

More about Metasystems on Wikipedia.

I am not knowledgeable on the subject, but I'll give my understanding of the description. Say we had a simple program that copies stdin to stdout. This would be our computing machine M. Our metamachine M' is a second program that takes the source of M as input (or is otherwise constructed to inherently know of M) and is therefore able to understand not only what M does, but how it does so.

If my understanding is correct, then the obvious question is why we care about M'. What comes to my mind is automatic optimisation. If we can understand both how M works and what M is trying to accomplish, M' may discover ways to improve the operation of M, in either space or time. Furthermore, and importantly, M' may substitute for M, since M' can accomplish whatever M did. This means that M'' can improve on the ways M' optimised M, and subsequently replace M', and so on.
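As a toy sketch of the M / M' idea (entirely my own construction, with made-up names `Expr`, `run`, and `optimize`): let M be a program in a tiny expression language, and let M' be a program that both imitates M (by interpreting it) and analyzes M's source to emit an improved replacement, folding away everything with no runtime data dependency and leaving a residual expression:

```haskell
-- M is represented as data: a tiny expression language where
-- Input stands for runtime data M' cannot know in advance.
data Expr = Lit Int
          | Add Expr Expr
          | Input
          deriving (Eq, Show)

-- M' imitating M: run the program given the runtime input.
run :: Int -> Expr -> Int
run _   (Lit n)   = n
run inp (Add a b) = run inp a + run inp b
run inp Input     = inp

-- M' improving M: constant-fold the statically known parts,
-- leaving a smaller residual program with the same behavior.
optimize :: Expr -> Expr
optimize (Add a b) =
  case (optimize a, optimize b) of
    (Lit x, Lit y) -> Lit (x + y)
    (a', b')       -> Add a' b'
optimize e = e

main :: IO ()
main = do
  let m  = Add (Add (Lit 1) (Lit 2)) Input  -- (1 + 2) + input
      m' = optimize m                        -- residual: 3 + input
  print (run 10 m == run 10 m')
```

This also echoes the comment above about residual expressions: the parts of M with no runtime data dependencies get evaluated away, and what remains forms the improved machine.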

erisco