
I have a project that involves sub-directories with sub-makefiles. I'm aware that I can pass variables from a parent makefile to a sub-makefile through the environment using the export command. Is there a way to pass variables from a sub-makefile to its calling makefile? That is, can export work in reverse? I've attempted this without success. I'm guessing that once the sub-make finishes, its shell is destroyed along with its environment variables. Is there another standard way of passing variables upward?

Andrew
  • What are you trying to do exactly? You might be able to do it with an include directive. Otherwise, your only communications channel is the filesystem. – rici Jul 14 '15 at 18:34
  • You could do something like this, but as a rule of thumb anytime you're having to hack in support for a major feature like this it's often a bad idea. – Brian Vandenberg Jul 14 '15 at 18:35
  • One of my sub-directories is an independent code from the main code. It generates several object files that need to be linked with the main code. These object files are dependent upon each other so the order in which they are linked matters. In the sub-makefile I have an OBJS variable which contains the list of object files in the correct order. I would just like to avoid reconstructing the same list in the top-level makefile. – Andrew Jul 14 '15 at 18:46
  • I should also mention the sub-directory is Fortran code, while the main code is C++. The object files contain wrappers to use the Fortran code. I don't think I can solve this with an include directive, can I? – Andrew Jul 14 '15 at 18:49
  • Initially I was inclined not to up-vote this because it doesn't seem like a wise thing to do; however, this is certainly a question others may ask themselves and finding out it's not a good idea would be valuable for them. – Brian Vandenberg Jul 14 '15 at 19:44

3 Answers


The short answer to your question is: no, you can't [directly] do what you want for a recursive build (see below for a non-recursive build).

Make executes a sub-make process as a recipe line like any other command; its stdout/stderr get printed to the terminal like any other process's. In general, a child process cannot affect its parent's environment (strictly speaking we're dealing with make variables rather than just environment variables here, but the same principle applies) -- unless you intentionally build such a channel into the parent process, in which case you'd be using IPC mechanisms to pull it off.

There are a number of ways I could imagine pulling this off, all of which sound like an awful thing to do. For example, you could write the variables to a file and then source it with an include directive inside an eval (note: untested):

some_target:
  ${MAKE} ${MFLAGS} -f /path/to/makefile

some_other_target : some_target
  $(eval include /path/to/new/file)

... though it has to be in a separate target as written above because all $(macro statements) are evaluated before the recipe begins execution, even if the macro is on a later line of the recipe.
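To make the file-based approach concrete, the sub-makefile would have to write the include file itself. A minimal sketch of that side (the OBJS variable and the /path/to/new/file path are illustrative, matching the parent recipe above):

```make
# sub-makefile (hypothetical): build the objects, then record the
# variable in a form the parent can later pull in via 'include'
OBJS := first.o second.o third.o

all: $(OBJS)
        @echo 'OBJS := $(OBJS)' > /path/to/new/file
```

After the parent's $(eval include /path/to/new/file) runs, OBJS would be defined in the parent with the child's value.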

gmake v4.x has a new feature that allows you to write out to a file directly from a makefile directive. An example from the documentation:

If the command required each argument to be on a separate line of the input file, you might write your recipe like this:

program: $(OBJECTS)
        $(file >$@.in) $(foreach O,$^,$(file >>$@.in,$O))
        $(CMD) $(CMDFLAGS) @$@.in
        @rm $@.in

(gnu.org)

... but you'd still need an $(eval include ...) macro in a separate recipe to consume the file contents.

I'm very leery of using $(eval include ...) in a recipe; in a parallel build, the included file can affect make variables, and the timing of when the inclusion occurs could be non-deterministic with respect to other targets being built in parallel.

You'd be much better off finding a more natural solution to your problem. I would start by taking a step back and asking yourself "what problem am I trying to solve, and how have other people solved that problem?" If you can't find people trying to solve that problem, there's a good chance it's because they didn't start down the path you're on.


edit You can do what you want for a non-recursive build. For example:

# makefile1
include makefile2

my_tool: ${OBJS}


# makefile2
OBJS := some.o list.o of.o objects.o

... though I caution you to be very careful with this. The build I maintain is extremely large (around 250 makefiles). Each level includes the makefiles below it with a statement like the following:

include ${SOME_DIRECTORY}/*/makefile

The danger here is you don't want people in one tree depending on variables from another tree. There are a few spots where for the short term I've had to do something like what you want: sub-makefiles append to a variable, then that variable gets used in the parent makefile. In the long term that's going away because it's brittle/unsafe, but for the time being I've had to use it.
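The append pattern described above looks roughly like this (a sketch; the directory and variable names are invented for illustration):

```make
# ${SOME_DIRECTORY}/foo/makefile
ALL_OBJS += foo/wrapper.o foo/impl.o

# ${SOME_DIRECTORY}/bar/makefile
ALL_OBJS += bar/glue.o

# parent makefile, after the 'include ${SOME_DIRECTORY}/*/makefile' line:
my_tool: ${ALL_OBJS}
```

Because everything is read into a single make instance, the parent sees the fully appended list -- but nothing stops one tree from quietly depending on another tree's variables, which is the brittleness mentioned above.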

I suggest you read the paper Recursive Make Considered Harmful (if that link doesn't work, just google the name of the paper).

Brian Vandenberg
  • You're right on many points. I suppose the question doesn't arise because I was heading down the wrong path to begin with. I've decided to just create an archive of the object files in the sub-makefile and link to it in the top-level makefile. This way the order and dependencies are preserved. – Andrew Jul 14 '15 at 19:13
  • That's a much better solution. – Brian Vandenberg Jul 14 '15 at 19:23
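For reference, the archive approach Andrew settled on in the comments might look like this (a sketch; the library and directory names are invented):

```make
# sub-makefile (the Fortran wrappers)
libfwrap.a: $(OBJS)
        $(AR) rcs $@ $^

# top-level makefile (the C++ code)
main: $(CXX_OBJS) subdir/libfwrap.a
        $(CXX) -o $@ $(CXX_OBJS) -Lsubdir -lfwrap
```

The archive keeps the sub-project's object list entirely inside the sub-makefile, so the parent only needs to know one filename.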

Your directory structure probably looks like this:

my_proj
|-- Makefile
|-- dir1
|   `-- Makefile
`-- dir2
    `-- Makefile

And what you are doing in your parent Makefile is probably this:

make -C ./dir1
make -C ./dir2

This actually spawns/forks a new child process for every make call.
You are asking to update the environment of the parent process from its children, but that's not possible by design (1, 2).
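You can see the same restriction in any shell, independent of make (the `( ... )` grouping runs in a subshell, i.e. a separate child process):

```shell
# A child process cannot modify its parent's environment.
FOO=original
( FOO=changed; export FOO )   # the change is confined to the subshell
echo "$FOO"                   # prints: original
```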

You still could work around this by:

  • using a file as shared memory between two processes (see Brian's answer)
  • using the child's exit error code as a trigger for different actions [ugly trick]
Eugeniu Rosca
  • Yes I'm using `$(MAKE) -C ./dir1`. I was guessing the problem was that the child process couldn't update the parent's environment. – Andrew Jul 14 '15 at 19:49

I think the simplest solution is to capture the standard output of a sub-make. Note that the -s flag and the @ prefix matter: any other output from the child (echoed commands, "Entering directory" messages) would end up in the variable too.

Parent Makefile

VAR := $(shell $(MAKE) -s -C child-directory)

all:
        echo $(VAR)

Child Makefile

all:
        @echo "MessageToTheParent"
th yoo