
I would ideally like to be able to add (very repetitive) C/C++ code to my actual code at compile time, the same way one does with macros, except that the generated code would come from, say, the stdout of a Python script.

For example, say I want functions that depend on the public attributes of a given class; being able to just write the following in my C++ code would be a blessing:

generate_boring_functions(FooBarClass,"FooBarClass.cpp")

Is that feasible using conventional means? Or must I hack with Makefiles and temporary source files?

Thanks.

Manux
  • I thought Makefiles were conventional means. What did you have in mind? – David Thornley Jun 17 '10 at 17:26
  • What are the boring functions? Are they something that could be solved with a template-based solution? – Oliver Charlesworth Jun 17 '10 at 17:28
  • No it's not template-solvable, and I know Makefiles are conventional means, but I just meant something that would work just by doing the stupid simple "g++ foo.cpp -o foo", I guess that's too much to ask ;) – Manux Jun 17 '10 at 17:39
  • This question seems interesting, but it's kinda unclear what you want. A more detailed example of input/output would help. – Greg Domjan Jun 17 '10 at 18:32
  • I may be a bit late to add to this topic, but you might find the following question relevant: http://stackoverflow.com/questions/39273219/using-x-lists-and-preprocessor-directives-to-generate-configurable-c-code-at-com/39273220#39273220 – Dávid Tóth Sep 01 '16 at 13:45

7 Answers

4

You would most likely need to tweak the Makefile a bit. It would be easy to write a (Python) script that runs over each of your source files as an additional preprocessing step, replacing instances of generate_boring_functions (or any other script-macro) with the generated code, perhaps by invoking generate_boring_functions.py with the right arguments. You could even bypass the need for temporary files by sending the processed source to the compiler over standard input.

Damn, now I want to make something like this.

Edit: A rule like the following, stuck in a makefile, could be used to handle the extra build step. It is untested and added only for the sake of completeness.

%.o : %.cpp
    python macros.py $< | g++ -x c++ -c - -o $@
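For a flavor of what that preprocessing pass could look like, here is a minimal sketch of a hypothetical macros.py (the macro name, regex, and generated function are all illustrative assumptions): it expands each generate_boring_functions(Class, "file") call into real C++ and writes the result to stdout for the compiler to consume.

```python
import re
import sys

# Match generate_boring_functions(SomeClass, "some_file.cpp") calls.
MACRO = re.compile(r'generate_boring_functions\(\s*(\w+)\s*,\s*"([^"]+)"\s*\)')

def expand(match):
    cls, filename = match.groups()
    # Whatever repetitive code you need, stamped out per class (illustrative):
    return (f'void print_{cls}_info() {{\n'
            f'    std::puts("{cls} is defined in {filename}");\n'
            f'}}')

def preprocess(source):
    """Replace every macro call in the source text; leave the rest untouched."""
    return MACRO.sub(expand, source)

if __name__ == "__main__" and len(sys.argv) > 1:
    # Usage: python macros.py foo.cpp | g++ -x c++ -c - -o foo.o
    with open(sys.argv[1]) as f:
        sys.stdout.write(preprocess(f.read()))
```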
Jon Purdy
  • How would I send the source to the compiler over stdin (not using temp files)? – Manux Jun 17 '10 at 17:38
  • Indeed, but I can have python parse all files, then concatenate all the necessary code in the right order (headers first, etc.) easily, then just call something like "python generate_all_code.py | gcc -x cpp -o out -", as if I compiled one big C++ file... bad idea? – Manux Jun 17 '10 at 18:06
  • 2
    @Manux, directly piping to gcc is likely to put you in a world of hurt when you need to debug your python output. I suggest you save to a temp file and then cat the temp file to gcc. – JSBձոգչ Jun 17 '10 at 19:08
3

If a makefile isn't conventional enough for you, you could get by with cleverly-written macros.

class FooBarClass
{
    DEFINE_BORING_METHODS( FooBarClass )

    /* interesting functions begin here */
}

I very frequently see this done to implement the boilerplate parts of COM classes.

But if you want something that's neither make nor macro, then I don't know what you could possibly mean.

JSBձոգչ
3

I've never used this particular technology, but it sounds as though you're looking for something like Ned Batchelder's Cog tool.

Python scripts are embedded in a C++ source file such that, when the file is run through the cog tool, additional C++ code is generated for the C++ compiler to consume. So your build process would include an extra step in which cog produces the actual C++ source file before the C++ compiler is invoked.

Michael Burr
2

A makefile (or equivalent) is a "conventional" means!

Oliver Charlesworth
0

You could try the Boost Preprocessor Library. It's just an extension of the regular preprocessor, but if you're creative, you can achieve nearly anything with it.

Puppy
0

Did you have a look at PythoidC? It can be used to generate C code.

Onkar Deshpande
0

I have encountered this exact same problem multiple times.

I use it exactly in the way you describe (i.e. to run boringFunction(filename, "filename.cpp") for a set of files).

It is used to generate code that "registers" the code contained in a specific set of files into a std::map, so that user-written functions can be added to the library without dynamically recompiling the whole library, and without relying on the (likely novice) user to write syntactically correct C++ code to e.g. implement class functions.

I have solved it in two ways (which are basically equivalent)

1) A purely C++ "bootstrapping" method: during compilation, make first compiles a simple C++ program that generates the necessary files, and then calls a second makefile that compiles the actual code generated in the temporary files.

2) A shell-based method that uses bash to accomplish the same thing (i.e. simple shell commands iterate through the files and output new files to a temporary location, then call make on the output).

The functions can either be output to one file each, or can be output to one monolithic file for the second compilation.

Then, the functions can either be loaded dynamically (i.e. they are compiled as a shared library), or I can recompile all the rest of the code with the generated functions included.

The only hard parts were (a) figuring out a way to register the function names uniquely (e.g. the preprocessor's __COUNTER__ only works if everything goes into a single monolithic file), and (b) figuring out how to reliably call the generation step in the makefile before the main makefile runs.

The advantage of the pure-C++ method (versus e.g. bash) is that it could work on systems that do not ship the same bash shell by default (e.g. Windows or macOS), in which case a more complex cmake-based method is of course necessary.

I have included the hard parts of the makefile for posterity:

The first makefile called is:

# Dummy to compile filters first
$(MAKECMDGOALS): SCRIPTCOMPILE
    $(MAKE) -f Makefile2 $(MAKECMDGOALS)

SCRIPTCOMPILE:
    @sh scripts/filter_compiler_single.sh filter_stubs

.PHONY: SCRIPTCOMPILE

Where scripts/filter_compiler_single.sh is e.g.:

BUILD_DIR="build/COMPILED_FILTERS";
rm -rf "$BUILD_DIR"  # -f: do not fail on the first run, before the directory exists
mkdir -p "$BUILD_DIR"

ARGSET="( localmapdict& inputmaps, localmapdict& outputmaps, void*& userdata, scratchmats& scratch, const std::map<std::string,std::string>& params, const uint64_t& curr_time , const std::string& nickname, const std::string& desc )"

compfname=$BUILD_DIR"/COMPILED_FILTERS.cpp"

echo "//// START OF GENERATED FILE (this file will be overwritten!) ////" > $compfname  #REV: first overwrites
echo "#include <salmap_rv/include/salmap_rv_filter_includes.hpp>" >> $compfname
echo "using namespace salmap_rv;" >> $compfname

flist=$(find $1 -maxdepth 1 -type f) #REV: add constraint to only find .cpp files?
for f in $flist;
do

    compfnamebase=$(basename $f) #REV: includes .cpp
    alg=${compfnamebase%.cpp}
    echo $f "  >>  " $compfname
    echo "void ""$alg""$ARGSET""{" >> $compfname
    echo "DEBUGPRINTF(stdout, \"Inside algo funct "$alg"\");" >> $compfname; #REV: debug...
    cat $f >> $compfname
    echo "}""REGISTER_SAL_FILT_FUNC(""$alg"")" >> $compfname
done

echo "//// END OF GENERATED FILE ////" >> $compfname

The second makefile Makefile2 is the normal compilation instructions.

It is not beautiful, and I would love to find a better way to do it, but as it is, extracting even just the base filename from every file during compilation is difficult even using templates or constexpr (e.g. some macro function that takes __FILE__). And that would rely on the user remembering to add the specific macro call to their function filter stub, which is just extra unnecessary work and asking to introduce spelling errors etc.

rveale