
I know there are similar posts, but I feel like my questions are specific enough to still ask them. Let's say I opt to build an executable by putting actual definitions in header files, and then just including them all in a single cpp file.

If I understand correctly, the implications of that only affect the build process, i.e. compilation and linking, and have no impact on runtime performance. Is that true?

Moreover, compilation times should be higher, while linking times should be lower. Shouldn't that even it out? And if it doesn't, are increased compilation times truly the only reason nobody builds an application like that?

sepp2k
    The actual problem is that you'll have to rebuild your _whole_ code every time, even after the tiniest change to it. For big projects that can be hours (try building Clang from source, for example). If you build object files first, or libraries, then you'd only need to rebuild the files of that particular library or your binary. – ForceBru Apr 21 '19 at 22:34
  • Linking is blazing fast compared to compiling, and both are faster than working in a project that is a literal mess. Your proposal is the opposite of what the language and the tools surrounding it, including the compiler, were designed to work properly with, and there is no version of this that can go right. – Havenard Apr 21 '19 at 22:38
  • Actually, it can make a large difference on runtime performance, so large that toolchains provide an option to do this ("Whole Program Optimization") even when invoked on a bunch of separate compilation units. See the linked questions. – Ben Voigt Apr 21 '19 at 22:46
  • @BenVoigt Even so you can just use `#include` instead. – Havenard Apr 21 '19 at 22:49
  • @Havenard: Only if the code is structured not to have naming collisions at file scope. – Ben Voigt Apr 21 '19 at 22:56

1 Answer


A short TL;DR of the compilation process. The compiler first runs the preprocessor, which executes the preprocessor directives (i.e. lines starting with a hash, such as the includes).

The include directive simply copies the content of one file into another. So, moving stuff into the included files (conventionally, header files) does not change much in this regard. We keep definitions out of headers because each translation unit (think of it as a .cpp after the preprocessor has run, with the included files inserted) must contain at most one definition of each entity, and the program as a whole at most one definition of each non-inline function or variable (the one-definition rule). Having definitions in headers makes this hard to avoid.
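Both points can be seen directly from the toolchain; a minimal sketch, with made-up file names (`add.h`, `one.cpp`, `two.cpp`) purely for illustration:

```shell
# add.h contains a full *definition*, not just a declaration
cat > add.h <<'EOF'
int add(int a, int b) { return a + b; }
EOF
cat > one.cpp <<'EOF'
#include "add.h"
int one() { return add(1, 0); }
EOF
cat > two.cpp <<'EOF'
#include "add.h"
int main() { return 0; }
EOF

# -E stops after preprocessing: the header's text is pasted in verbatim
g++ -E one.cpp | grep 'int add'

# Both translation units now define add(), so the link step fails
# with a "multiple definition" error
g++ one.cpp two.cpp -o app
```

Marking the function `inline` in the header (or keeping only a declaration there and the definition in a single .cpp) makes the second command succeed.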

So, the compilation of one translation unit is more or less the same. The problem, besides the multiple-definition issue, is that when a header is modified, every file that includes it must be recompiled. If it's a widely used header, this may take a while. On the other hand, if a .cpp file is modified, only that file is recompiled.

Since the preprocessor basically performs text manipulation, the resulting executable should be the same (setting aside whole-program optimization, which the comments above touch on).

Paul92