From Wikipedia on Metacompilation:
Metacompilation is a computation which involves metasystem transitions
(MST) from a computing machine M to a metamachine M' which controls,
analyzes and imitates the work of M. Semantics-based program
transformation, such as partial evaluation and supercompilation (SCP),
is metacomputation.
More about metasystems can be found on Wikipedia.
I am not knowledgeable on the subject, but I'll give my understanding of the description. Say we had a simple program that copies stdin to stdout. This would be our computing machine M. Our metamachine M' is a second program that takes the source of M as input (or is otherwise constructed with built-in knowledge of M) and can therefore understand not only what M does, but how it does so.
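To make that concrete, here is a minimal sketch in Python of what M and M' might look like. This is purely illustrative: `M_SOURCE` and `metamachine` are names I've invented, and a real metamachine would work on a much richer representation than raw source text.

```python
import sys

# M: a trivial "computing machine" that copies stdin to stdout,
# represented here as source text so that M' can operate on it.
M_SOURCE = """\
for line in sys.stdin:
    sys.stdout.write(line)
"""

def metamachine(source):
    """M': operates on the *source* of M, not on M's input data."""
    # Analyze: M' can inspect how M is written, not just what it computes.
    print(f"M is {len(source.strip().splitlines())} lines long",
          file=sys.stderr)
    # Imitate: M' reproduces M's behaviour by executing its source.
    exec(source, {"sys": sys})

if __name__ == "__main__":
    metamachine(M_SOURCE)
```

The key point is the level shift: the data that M' computes over is itself a program, which is the minimal sense in which M' "controls, analyzes and imitates the work of M".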
If my understanding is correct, then the obvious question is: why do we care about M'? What comes to my mind is automatic optimisation. If M' understands both how M works and what M is trying to accomplish, it may discover ways to improve the operation of M, in either space or time. Furthermore, and importantly, M' may substitute for M, since M' can accomplish whatever M did. This means that an M'' can improve on the way M' optimised M, subsequently replace M', and so on.
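Partial evaluation, which the quoted passage names as an instance of metacomputation, gives a concrete flavour of this kind of optimisation. Below is a toy sketch (the function names are my own, and a real partial evaluator is far more general): M is a general power function, and M' uses its knowledge of M's structure plus a known exponent to emit a specialised, loop-free version.

```python
# M: a general-purpose power function; n arrives at runtime.
def power(x, n):
    result = 1
    for _ in range(n):
        result *= x
    return result

# M': knows the structure of M and the fact that n is fixed, so it
# emits a specialised program with the loop fully unrolled.
def specialise_power(n):
    body = " * ".join(["x"] * n) if n > 0 else "1"
    return f"def power_{n}(x):\n    return {body}\n"

src = specialise_power(5)
print(src)  # the generated, loop-free source of power_5

namespace = {}
exec(src, namespace)
assert namespace["power_5"](2) == power(2, 5) == 32
```

Because `power_5` is itself an ordinary program, the same trick could in principle be applied to it again, which is the M'' step described above.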