
Python, among other languages, uses an import mechanism to pull in external functionality.

C, among others, uses include (with attendant namespace headaches in, e.g., C++).

What is the reason to pick one over the other (or to support both, as Objective-C does) when designing a language?

I see that Apple is proposing some updates/changes to this model in a paper to the LLVM community, and I wonder why the differences exist.

Clarification based on @delnan's answer:

Given that there are multiple ways to implement import (of which I was unaware until his answer), what is the overall benefit of import versus include? The import technique seems to find only individual subcomponents based on the path given to them (at least in Python, whose [apparent] method is the only one I know).

How do other implementations of the import methodology differ from that? When would using the 'old style' include method make sense in modern language design and implementation (if ever)?

– warren
  • -1 because you have not identified what you consider to be the differences between these approaches, or defined what they mean. – Marcin Nov 26 '12 at 18:52
  • 1
    @Marcin - I am asking for the differences, as I do not know: that is the question. The downvote is an obvious misunderstanding of the question asked. – warren Nov 26 '12 at 18:54
  • No, I understand your question, in so far as it is defined. The point I am making is that your question is insufficiently defined because you do not describe the approaches you are requesting people to comment on. What do you think defines "the import approach"? What do you think defines "the include approach"? Were your complaint well-made, delnan's answer would be poor, but in fact it is the best answer you can ever have, given the problems with your question. – Marcin Nov 26 '12 at 18:55
  • 1
    @Marcin It's true that the "import approach" is not known terminology. The "include approach", however, seems pretty clear to me: it's whatever C does with the `#include` preprocessor directive, plus associated idioms. –  Nov 26 '12 at 18:59
  • @delnan But what are the boundaries of that? Given that `namespace` is mentioned, it presumably goes beyond mere macro-processing; does it include module linking techniques, or is that out of scope? There's far too much ambiguity here. It sounds like a question posed at university to a student who doesn't understand it. – Marcin Nov 26 '12 at 19:02
  • @Marcin - I haven't been a student for more than half a decade now: and while yes, I do not understand the intricacies (see clarification), the question is still one I have. – warren Nov 26 '12 at 19:05
  • You've managed to make this even less clear. – Marcin Nov 26 '12 at 19:06
  • 2
    I wouldn't phrase it that harshly, but you do turn one question into several more-or-less independent ones, so I think it'd make more sense to call it a day here and ask these questions separately, after some research (in particular, look at other module systems -- yet another excuse to learn more programming languages, which is a good idea anyway!). –  Nov 26 '12 at 19:09

1 Answer


The approach of C, which C++ and Objective C simply inherited, is very simple to define and implement (in a nutshell, "when encountering an #include, replace it with the contents of the file, and continue"), but has serious problems. Some of these problems are named in the presentation you've seen (and elsewhere). There are idioms and best practices (also discussed in that presentation and elsewhere) and minor extensions (#pragma once, precompiled headers) which alleviate some of the problems, but at the end of the day, the approach is fundamentally too limited to handle what software engineers have come to expect of a module system. Pretending it does what more recent alternatives do (see below) is a quite leaky abstraction.

Nowadays, everyone with an opinion on language design seems to agree that you shouldn't do it if you can help it. C++ and Objective C didn't have that choice due to the need for backwards compatibility (though both had and still have the choice to add another mechanism, and Objective C did so). It is "fair for its day", in that it was a rather good decision back when it was made (it worked well enough, and it still kinda works if you have discipline), but the world has moved on and settled on better ways to split code into modules, then pull it back together. (Note that such ways already existed back in the early C days, but apparently they didn't catch on for a while.)

What you describe as "the" import technique is actually a pretty large design space. Many module systems are almost, but not quite, entirely unlike each other - and the rest still have enough subtle differences to ruin your day. It can be anything from just executing the imported file in a new scope (Python, PHP) to full-blown ML-style functors. There are some similarities, in that all these module systems give each "module" (whatever that means in the respective system) its own scope/namespace, (usually) permit separate compilation of modules, and generally go out of their way to fix the problems of C-style textual includes (or whatever other problem the creator sees with the alternatives). That's about as much as one can say in general.

  • wrt to the `"the" import technique`, I was not aware there were more than one. – warren Nov 26 '12 at 19:00
  • 1
    With "Import" as used in Modula-2 (and "With/Use" as in Ada) you can not just bypass the obvious problems with #include, but you can also achieve namespace control; either importing all of a module and accessing it by qualified name (module.function) or importing only what you need. Your compiler can also use the import lists to properly track your program's dependencies, and avoid the whole Makefile mess... –  Dec 11 '12 at 17:05