When you include a header file (with a .h, .hpp, .hh, or .hxx extension) from the C standard library or the C++ STL (or even one of Microsoft's big ones like windows.h), and use only a limited number of functions from it, does the whole file get compiled and copied into the executable, or only the specific, relevant functions that are actually used?
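To make the question concrete, here is roughly the situation I have in mind (a minimal sketch; the exact header and function don't matter):

```cpp
#include <cstdio>   // this header declares many functions (printf, scanf, fopen, ...)

int main() {
    std::puts("hello");   // but only this one function from it is actually used
    return 0;
}
```

Does the executable end up carrying code for everything the header declares, or only for puts and whatever it needs internally?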
If the whole file is copied (and that's what I think actually happens, since windows.h has macro defines such as WIN32_LEAN_AND_MEAN specifically to cut down on how much of it gets pulled in) - why does it work this way? Why doesn't the #include preprocessor command copy only the functions that are used (plus the other functions those depend on behind the scenes)? Wouldn't that be more cost-effective?
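For reference, this is the WIN32_LEAN_AND_MEAN usage I'm referring to (assuming a Windows toolchain; the specific API call is just an example):

```cpp
// Defining this before the include excludes rarely-used parts of the Windows
// headers (Winsock, RPC, DDE, etc.), shrinking what the preprocessor pastes in.
#define WIN32_LEAN_AND_MEAN
#include <windows.h>

int main() {
    // Only a single API function is called here, yet #include still pasted the
    // entire (trimmed) header text into this translation unit.
    MessageBoxA(nullptr, "Hello", "Example", MB_OK);
    return 0;
}
```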
If only the relevant functions are copied - why do we have to write header includes in the first place? Why don't we just have one huge pool of functions that get used and compiled on demand, the way it works in interpreted languages such as PHP (and, in some cases, Python)? If the answer is just "C and C++ are older and have been standardized", why not do it in their newer versions, or at least have modern compilers allow it (we've already seen cases where compilers bend the aging language in favor of the modern developer)?
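To illustrate, this is the kind of thing I imagine - purely hypothetical, and it does not compile as standard C++ today:

```cpp
// Hypothetical: no includes at all; the toolchain would somehow figure out on
// its own that printf comes from the C library and pull in just that function.
int main() {
    printf("hello\n");   // real C++ rejects this: 'printf' was not declared in this scope
    return 0;
}
```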