
I had initially inquired about why .h files are necessary in C++. I had a question about why we don't just write #include "file.cpp" instead (where file.cpp would contain both the function/class declarations and the definitions), but then I found this link, which stated that doing this essentially amounts to copying all of the code from file.cpp. So if I'm understanding this right:

  1. The confusion/error comes from linking. So if I compiled only the main file that does #include "file.cpp", but not file.cpp itself, wouldn't that prevent the error from occurring? Or do you automatically compile AND link anything that you #include?
  2. If it is possible, then is the reason we don't do this that, in large projects, it is easier to #include a .h file than a .cpp file, even though the contents of the .cpp file will be used many times (i.e. the .cpp file only has to be compiled once)?
bayesianpower
  • some projects do have a single compilation unit for faster compilation time. [#include all .cpp files into a single compilation unit?](https://stackoverflow.com/q/543697/995714), [Why not concatenate C source files before compilation?](https://stackoverflow.com/q/42135503/995714), [The benefits / disadvantages of unity builds?](https://stackoverflow.com/q/847974/995714) – phuclv Apr 05 '20 at 15:18
  • We don't compile header files (*.h) directly (ignoring precompiled headers). – Jarod42 Apr 05 '20 at 15:27
  • The convention is to have translation units as .cpp (or .cxx, .cc) files and give them to the build system. So including them would produce multiple definitions. – Jarod42 Apr 05 '20 at 15:35

3 Answers


Or is it such that you automatically compile AND link anything that you #include?

An #include simply copy-pastes the file's contents in place. It has nothing to do with linking.

So if you #include a file, you don't need to compile it on its own.

If it is possible, then is the reason why we don't do this because in large projects, it is easier to compile a .h file that has been listed to #include instead of a .cpp file

Header files are simply files that are meant to be #included, nothing less, nothing more. So they are not compiled on their own.


Header files are used because when you have multiple compiled files (translation units), you have to have some way to share functions, classes, etc. from one to the others. So a header file is created with whatever information is needed for other compiled files to use.

For instance, if you have a function f defined in a.cpp, you need a declaration in other files b.cpp, c.cpp, etc. Instead of repeating the declaration in every file (repeating yourself is hardly useful!), people create a header file and #include it.

So header files are not strictly necessary in C++, but since nobody wants to repeat themselves, we use them.

And you can definitely compile an entire program as a single translation unit, possibly with headers, if you want. But most projects don't, because separate translation units mean only the changed files need recompiling.

Acorn

So if I compiled only the main file that does #include "file.cpp", but not file.cpp itself, wouldn't that prevent the error from occurring?

Yes. And in a world where you only have two files, that's fine. The moment you have multiple files which include stuff from multiple other files, this starts to break down. At the very least, you would have to make it so that your source files constitute a directed, acyclic graph of inclusions, so that there could be a single, well-defined order of includes that works everywhere.

Now, it is possible to compile only a single .cpp file which itself just includes all of the source files. These are called "unity builds". But you don't generally develop that way. This is done mainly for optimization reasons, because link-time optimization is not nearly as capable as compile-time optimization.

And there are dangers to such a build style. Source files can use static or unnamed namespace-scoped declarations to create private definitions that are only visible within that source file. This is done mainly to avoid name conflicts, allowing the source file to use a name that it knows external code will never see. Once you start including entire source files, those definitions get included too, and therefore they can start conflicting. You have to code in a specific style to avoid these kinds of conflicts. So you can't just take any two source files and include them and expect it to work.

And then there are basic compile-time reasons. If you have a 10,000 source file project, and you're including all of those source files into a single source file that gets compiled, any change to any one source file provokes a complete recompile of the entire project. That's not really something you want to do during active development; unity builds are typically used for delivering the final executable. Also, it's difficult for such builds to take advantage of parallel compilation, since you can't compile individual "source files" independently from one another when all the compiler sees is a single blob of text.

Nicol Bolas
  1. Yes, if your main file included "file.cpp", and you didn't compile that file anywhere else or include it anywhere else, then that would work. But what exactly is the point? As far as the compiler is concerned you have only one file, so why not have only one file in reality?

  2. There's code that can only be compiled once, and there's code that it's OK to compile many times. Very roughly speaking, definitions must only be compiled once, but declarations can be compiled as many times as you like. So the convention is to put definitions in .cpp files, which we pass to the compiler, and declarations in header files, which we #include instead of compiling directly. The reason we do it is that it's a sensible convention that lets us organise code in such a way that different parts of the program can be compiled separately, thus improving build times.

john