
I was practicing building a linked list and thought of separating my functions into separate files to decouple everything from the main file.

This is the file structure I came up with:

./
    functions
        printlist.cpp
        functionbcd.cpp
    functions.h
    LinkedList.cpp
    Node.h

Header File in LinkedList.cpp

#include "functions.h"
#include <bits/stdc++.h>
using namespace std;

Header Files in functions.h

#include <bits/stdc++.h>
#include "Node.h"

Header Files in "Any Function Implemented".cpp

#include <bits/stdc++.h>
#include "../functions.h"
using namespace std;

Compile Command

g++ -ggdb -O2 -std=c++14 LinkedList.cpp functions\*.cpp

Now if I keep the structure mentioned above, my compile time is 4-5x longer than with the structure where I keep and define all the functions in one file along with main.

I am unable to understand this.

And if there is a better way to structure my files and improve the compile time, please do tell.

Thank You.

  • Please read [Why should I not #include <bits/stdc++.h>?](https://stackoverflow.com/questions/31816095/why-should-i-not-include-bits-stdc-h) It may be unrelated to your problem, but it should not be done in general. – user0042 Jul 28 '17 at 05:37
  • It's all about translation units and what you're doing in those functions. If you can declare your functions as 'static inline' the speed difference might be minimal. A compile time of 3ms vs 5ms is not something you need to worry about. – vincent Jul 28 '17 at 05:39
  • Okay, will remove it and see if it helps. – deadpoolAlready Jul 28 '17 at 05:40
  • @vincent Actually there is a huge difference in compile time. 4x-5x – deadpoolAlready Jul 28 '17 at 05:40
  • What is the 5x compile time in seconds? – vincent Jul 28 '17 at 05:41
  • around 8-10 seconds – deadpoolAlready Jul 28 '17 at 05:46
  • It would be good if you described the previous variant in the same detail. And call `gcc` via `time` in both cases. – fghj Jul 28 '17 at 05:54
  • The reason to separate the code into many compilation units is to only compile the ones that have changed. Therefore you need a build tool like make. –  Jul 28 '17 at 07:56

1 Answer

There's a fixed overhead for each of your files: launching the actual compiler for each of them, including and parsing the "common part" (i.e. the library includes), plus some per-file amount of work that the linker has to do.

Given that the actual code you wrote is minimal, the time cost of each of your files will be roughly the same, and roughly equivalent to this fixed overhead; so, the behavior you are seeing is not strange.

Separating into different files starts to make sense performance-wise for essentially two reasons:

  • incremental builds; if you have a large project (with a decent build system) and touch a single file, it's only the corresponding object module that will be rebuilt (+link), which is way faster than compiling a single huge file every time;
  • parallel builds; C++ compilers are generally single-threaded (which is not that strange, given that most of their job is strictly sequential), so the compilation of a single file cannot exploit the parallelism provided by current CPUs; but if you split your project into separate files, the problem becomes embarrassingly parallel, given that the compilation of each TU is completely independent from the others (the object modules are tied together only at the very end by the linker); so, splitting into multiple files quickly pays off when building on multiple cores.

(this besides the obvious maintainability advantages that come from having clearly separated files for different modules/classes)
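Both benefits require each TU to be compiled to its own object file, rather than passing all the .cpp files to g++ in one command. A minimal Makefile sketch for the layout in the question (file and directory names taken from there; the target name `LinkedList` is an assumption) might look like:

```make
CXX      := g++
CXXFLAGS := -ggdb -O2 -std=c++14

OBJS := LinkedList.o printlist.o functionbcd.o

# Link step: only reruns when an object file changed.
LinkedList: $(OBJS)
	$(CXX) $(CXXFLAGS) -o $@ $(OBJS)

LinkedList.o: LinkedList.cpp functions.h Node.h
	$(CXX) $(CXXFLAGS) -c $< -o $@

# Pattern rule for the sources under functions/.
%.o: functions/%.cpp functions.h Node.h
	$(CXX) $(CXXFLAGS) -c $< -o $@

clean:
	rm -f LinkedList $(OBJS)
```

With this, `make -j4` compiles the three TUs in parallel, and touching only printlist.cpp rebuilds just printlist.o plus the final link instead of recompiling everything.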

Matteo Italia