4

When you include a header file (.h, .hpp, .hh, .hxx extensions) from the C Standard Library or the C++ STL (or even a big Microsoft one like windows.h) and use only a limited number of functions from it, does the whole file get compiled and copied into the executable, or only the specific, relevant functions that are actually used?

If the whole file is copied (and that's what I think actually happens, since windows.h has macro defines such as WIN32_LEAN_AND_MEAN to reduce redundancy), why does it happen this way? Why doesn't the #include preprocessor command copy only the used functions (and the other functions those call behind the scenes)? Isn't that more cost-effective?

If only the relevant functions are copied, why do we have to write header includes in the first place? Why don't we just have a huge pool of functions that are compiled when needed, as in interpreted languages such as PHP (and in some cases Python)? If it's just because "C and C++ are older and have been standardized", why not do it in their newer versions, or at least have modern compilers allow it (we've already seen cases where compilers bend the aging language in favor of the modern developer)?

Reflection

4 Answers

3

The header file just tells the compiler the types of external functions and variables, and defines macros, types and so on. No code gets copied. Any functions and variables (so-called external symbols) that are referenced in your source file are linked in during the linker phase.

If you include stdio.h and use printf in your program, the compiler records printf as an "unresolved" symbol in the object file, and the linker then tries to find a function called printf either in the object files you explicitly link or in the libraries it is configured to search.

As stated nearby, there is no real difference between #including a file and copying the contents of that file into your source file. If that included file contains function or data definitions (not just declarations), then these DO become part of your object file.

user1338
  • So, why do we have to do header includes **for standard libraries** in the first place? Why isn't there just a pool of standard functions (as in PHP)? Only for compile-time reasons? – Reflection Nov 16 '13 at 15:12
  • 2
    Once upon a time, compilers were very slow. Also, in C the standard library is no different from any user-written library. Header files are mainly there to tell the compiler the types of functions and variables. The standard library has no special status of its own – user1338 Nov 16 '13 at 15:18
  • @user1338 you stated that "nothing gets copied", but when I looked up the C preprocessor on the Wikipedia page, https://en.wikipedia.org/wiki/C_preprocessor#Phases, it states that "The preprocessor replaces the line #include with the text of the file 'stdio.h', which declares the printf() function among other things."... Do different compilers do things differently? Can you please clarify. Thank you. –  Dec 22 '18 at 05:21
0

The entirety of the header will be compiled (except for parts excluded with #if, #ifdef, etc.).

As things were normally done in C, headers contained only declarations, not definitions, except for macros. As such, compiling a header mostly put names into the compiler's symbol table without producing any code.

In C++, a header often contains actual definitions, especially when templates are in use. In this case, a header may contain many inline function definitions along with class definitions and various declarations.

That doesn't mean that the entire content ends up in the final executable though. A modern linker does quite a bit to merge multiple definitions of the same object/function, and eliminate parts that aren't used.

Macros like WIN32_LEAN_AND_MEAN were invented primarily to improve compile times. They normally have little or no effect on the size of the final executable.

Jerry Coffin
0

Header files are not read directly by the compiler; instead they are read by the preprocessor, which basically copies their contents straight into the place of the #include directive.

As most header files don't contain any actual code, just declarations and macros, there's really nothing from such headers that actually ends up in the executable. All macros and recursive #include directives are resolved by the preprocessor. What is left for the compiler is declarations, type aliases (typedef declarations) and, well, more declarations. Declarations do not generate any code by themselves; they are only used by the compiler for its internal state when parsing and generating the actual code.

You might want to read about the C preprocessor.

Some programmer dude
-1

From my experience, header files have just about nothing to do with the file size, because almost all operating systems have the C or C++ runtime built in, so it is not necessary to copy the binaries.

The only case where headers have an effect on file size is when it is a header you have written yourself. WIN32_LEAN_AND_MEAN is used to exclude services that are almost never used from the header files.

If you would like more information, I also found this other Stack Overflow article.

KururuMan