23

I'm using (GNU) Make in my project. I'm currently putting one makefile per directory and specifying the subdirectories using SUBDIRS. It's been suggested to me that this is not the ideal way of using make, and that I should use a single top-level makefile (or several, split up using include) instead. I've tried migrating to that layout in the past, but it seemed unnecessarily complicated to me.

What are the benefits/drawbacks of using recursive makefiles?

Bill the Lizard
  • 398,270
  • 210
  • 566
  • 880
Johan Dahlin
  • 25,300
  • 6
  • 40
  • 55

8 Answers

24

The first thing you should keep in mind (just to eliminate any misunderstanding) is that we're not talking about a single vs. multiple makefiles. Splitting your makefile into one per subdirectory is probably a good idea in any case.

Recursive makefiles are bad primarily because you partition your dependency tree into several trees. This prevents dependencies between make instances from being expressed correctly. This also causes (parts of) the dependency tree to be recalculated multiple times, which is a performance issue in the end (although usually not a big one.)
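For illustration, the usual non-recursive alternative has each subdirectory contribute a small fragment that the top-level makefile pulls in with include, so everything ends up in one dependency tree. A minimal sketch (directory, variable, and file names are assumptions on my part):

# Top-level Makefile (all names are placeholders)
include libfoo/module.mk
include libbar/module.mk

app: main.o $(libfoo_OBJS) $(libbar_OBJS)
    $(CC) $(CFLAGS) -o $@ $^

%.o: %.c
    $(CC) $(CFLAGS) -c -o $@ $<

# libfoo/module.mk -- the fragment contributed by that subdirectory
libfoo_OBJS := libfoo/foo.o libfoo/foo_util.o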

There are a couple of tricks you need to use in order to properly use the single-make approach, especially when you have a large code base:

First, use GNU make (you already do, I see). GNU make has a number of features which simplify things, and you won't have to worry about compatibility issues.

Second, use target-specific variable values. This will allow you to have, for example, different values of CFLAGS for different targets, instead of forcing you to use a single CFLAGS for your entire build:

 main: CFLAGS=-O2
 lib: CFLAGS=-O2 -g
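As a small hedged sketch of how this plays out (the file names here are assumptions): the target-specific value is also in effect while that target's prerequisites are being built, unless it is overridden or declared private.

# -O2 applies to main and, through the prerequisites, to main.o and util.o
main: CFLAGS = -O2
main: main.o util.o
    $(CC) $(CFLAGS) -o $@ $^

%.o: %.c
    $(CC) $(CFLAGS) -c -o $@ $<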

Third, make sure you use VPATH/vpath to the full extent supported by GNU make.

You also want to make sure that you do not have multiple source files with the same name. One limitation of VPATH is that it does not allow you to have target-specific VPATH definitions, so the names of your source files will have to co-exist in a single "VPATH namespace".
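As a hedged sketch of how this tends to look in a single top-level makefile (all directory and file names here are assumptions):

vpath %.c src src/libfoo src/libbar
vpath %.h include

CFLAGS += -Iinclude
OBJS = foo.o bar.o main.o

main: $(OBJS)
    $(CC) $(CFLAGS) -o $@ $^

%.o: %.c
    $(CC) $(CFLAGS) -c -o $@ $<

Note that all the object files land in one place, which is why the same-name restriction above matters.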

JesperE
  • 63,317
  • 21
  • 138
  • 197
  • Great answer, thanks! I'm actually using VPATH even for a project using recursive make files, it makes life a lot easier, especially for srcdir != builddir builds. – Johan Dahlin Nov 26 '08 at 14:09
19

An article entitled "Recursive Make Considered Harmful" can be found here: http://miller.emu.id.au/pmiller/books/rmch/ (or at the Aegis project at SourceForge).

It explores the problems with recursive makefiles, and recommends a single-makefile approach.

Jonathan Leffler
  • 730,956
  • 141
  • 904
  • 1,278
Kristopher Johnson
  • 81,409
  • 55
  • 245
  • 302
7

I use recursion extensively. Each leaf node will have its own makefile; consider:

LIBS = libfoo libbar

# the directories exist on disk, so mark the targets phony to always recurse
.PHONY: $(LIBS)
$(LIBS):
    cd $@ && $(MAKE)

On large systems it would be quite a challenge not to have this structure. What folks say about "recursive make considered harmful" is an accurate assessment, but I think each situation is a little different and we all make some compromises.

Kramer
  • 174
  • 3
  • 5
4

Run, don't walk, to cmake.org and get CMake, one of the best build tools available.

You will still be using GNU make, but in this case CMake will generate the makefiles for you.

I can't guarantee it 100%, but I have yet to come across a case where it has not handled dependencies between subdirectories correctly (i.e. the problem that plagues recursive make). At the very least it is a lot easier to maintain CMake files than makefiles. Highly recommended.

Do not use GNU autotools - that way madness lies!

Alastair
  • 4,475
  • 1
  • 26
  • 23
  • 1
    I already know GNU autotools too well and I'm getting increasingly old to learn a new system. Maybe in another 5 years :-) – Johan Dahlin Nov 26 '08 at 14:16
  • 3
    CMake is worth the effort needed to learn. It is much easier than Autotools, and its cross-platform support is much better. – Kristopher Johnson Nov 29 '08 at 14:57
  • 13
    I have found make to have a much shallower learning curve than CMake. CMake's syntax is in constant flux, the documentation is awful, and in the end it is still unable to automatically find the locations of common libraries installed on systems, so you're constantly hunting for "FindLibraryXYZ.CMake" cmake scripts to include with your project. There is nothing elegant about it, so only use it if you really need it. Sorry for the rant. – Andrew Wagner Apr 12 '11 at 07:10
  • CMake (and Autotools) may be good for larger projects, but in the majority of cases it is overkill and an unnecessary dependency IMO. – j b Nov 07 '14 at 14:22
  • Also, by the looks of it, CMake itself generates recursive makefiles, so I fail to see the advantage of it if all that's needed is a simple build system. – Vigintas Labakojis May 15 '20 at 10:25
0

The benefit that I've gotten from recursive makefiles in the past is that it's easier to build just the files in a single subdirectory. You can do this with dependencies in a single makefile, but it's a bit more work to keep all of the targets straight. Basically, this makes it easier to make changes and test one library without having to deal with the full complexity of the larger project.
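In a single-makefile setup, the same convenience can usually be recovered with a phony per-directory target; a minimal sketch, with variable and directory names that are assumptions:

# `make libfoo` rebuilds only the objects under libfoo/
.PHONY: libfoo
libfoo: $(libfoo_OBJS)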

Dana the Sane
  • 14,762
  • 8
  • 58
  • 80
0

To throw in a third option, you could use GNU Autotools. They are mostly used for other reasons, but they may also be helpful for organizing a multi-directory build.

http://www.lrde.epita.fr/~adl/autotools.html

It has to be noted, though, that the result is a recursive make setup.

ypnos
  • 50,202
  • 14
  • 95
  • 141
0

Makepp is your friend.

http://makepp.sourceforge.net/

  • Backwards compatible with make
  • Automatic scanning for header dependencies
  • Graceful handling of directory trees
Andrew Wagner
  • 22,677
  • 21
  • 86
  • 100
-1

The issue with recursive make is the time overhead of evaluating all the different makefiles vs. evaluating one large makefile. Part of this is just spawning processes, but also (IIRC) you tend to be forced into assuming that other makefiles did something, and into rebuilding when you don't really need to.

My take on it is to have a single makefile per "unit", which more or less amounts to having a makefile for each chunk of code that you expect could be used on its own (e.g. as an independent library).

OTOH my current project breaks this all over the place, as I'm generating makefiles during the build. :b

BCS
  • 75,627
  • 68
  • 187
  • 294
  • 1
    No, time is not the biggest problem. The biggest problem is partitioning the dependency tree into several dependency trees, which prevents you from properly expressing dependencies across sub-makes. – JesperE Nov 26 '08 at 12:13
  • 3
    IMNSHO the only reason you even care about dependencies at all *is* time. If you don't care about time, just do a from scratch, blank slate rebuild every time. You can get "correct" dependencies expressed by coding so if anything changes, it's assumed that everything changed, but it takes more time. – BCS Nov 26 '08 at 17:26