6

I'm using a 3rd party open-source application that does something I think is strange. I'd like to hear your opinion on whether you think this is wrong / evil / an abomination / etc., or if there is some justifiable reason to do this.

Simply put, they use #include pre-proc directives to include "header files" that include fragments of code. Not prototypes of functions. Not in-line functions. Just sections of code.

Here is a simple example. First the main.cpp file:

#include <iostream>
//Other "normal" includes here...

using namespace std;

int main(int argc, char *argv[]) {

  cout << "Initializing program..." << endl;
  #include "parseArgs.h"

  // ... remainder of the program

  cout << "Exiting." << endl;
  return 0;
}

And within the parseArgs.h header file, a small code fragment. Note that this is exactly and only what is in the parseArgs.h file. This is not part of a function. There are no include guards, just the following 4 lines:

argList args(argc, argv);
if(!args.valid()) {
  cout << "Invalid arguments.";
  exit(1);
}

In the real program, there are several of these #include directives, with each one doing another small task.

It seems dangerous and crazy. I don't know why they don't write and call these as functions.

Your ideas and opinions?

Deduplicator
Madeleine P. Vincent
  • I don't think it's particularly dangerous, but does sound crazy and harder to maintain. – Alex Chamberlain Mar 11 '13 at 10:43
  • 3
    I would not recommend it, since it makes the code hard to follow and maintain, but there's nothing technically wrong with it. – Some programmer dude Mar 11 '13 at 10:43
  • Normally people use functions and similar stuff to reuse code... – PlasmaHH Mar 11 '13 at 10:44
  • I especially like how the included code fragment depends on local variables (that may not exist, or exist but be something completely different from what it expects them to be), and a `using namespace std;` statement having been executed prior to inclusion. Looks like a disaster waiting to happen. – Frédéric Hamidi Mar 11 '13 at 10:47
  • Maybe the person who did that came from the PHP world, because that is a pretty much accepted practice in the PHP community. – TravellingGeek Mar 11 '13 at 11:06
  • @GeraldSv If there was some kind of cross-pollination going on, it must have been the other way around, because OpenFOAM started out back in the late eighties, quite a few years before PHP even existed. – Michael Wild Mar 11 '13 at 11:36
  • The small code fragment in the parseArgs.h header file you posted consists of 5 lines, instead of 4 lines. – L. F. Apr 16 '19 at 11:15

5 Answers

9

I think you're talking about OpenFOAM here. The problem that these code snippets solve is that of avoiding the duplication of boilerplate code that many applications in OpenFOAM need. Putting this code in a function won't solve the problem, because the variables declared inside a function are local to its scope. One could perhaps come up with a scheme of base classes that contain these variables as members. But that would just add another layer of indirection that doesn't really solve anything. You're still dependent on variable names (or, if you want to make it clean, getter-names).

Personally, I'm undecided on whether this practice is good or bad. It is how it is, and it is part of the OpenFOAM code culture (or local lingo, if you want). At first sight it's surprising, but one gets used to it pretty fast.

However, unless you are developing OpenFOAM applications/extensions yourself, I would strongly discourage this practice. OpenFOAM is somewhat unique in that it contains virtually hundreds of executables that all require some overlapping boilerplate code that would be hard to maintain otherwise. If you're not in that situation, then don't do it.

Michael Wild
  • inline functions solve these issues, are more idiomatic and are safer. – Alex Chamberlain Mar 11 '13 at 11:16
  • No, they don't. If you take the example given by the OP, the thing that is missing there is that the `args` variable declared by the snippet is going to be used by the elided part. An inline function doesn't give you that. And -- as I said -- this is part of the domain-specific *language* of OpenFOAM. I would strongly recommend against doing this outside of OpenFOAM. – Michael Wild Mar 11 '13 at 11:20
  • 1
    Confused... `inline argList make_args(int argc, char *argv[]) { return argList(argc, argv); }`; add error checking for production code. – Alex Chamberlain Mar 11 '13 at 11:23
  • And how do you return, say 10 variables? Sure, you can use `boost::tuple` or `std::tuple` if your compiler is new enough, but OpenFOAM started out at a time when there wasn't even a C++ standard library around and CFront was the only compiler that kind-of-supported templates. – Michael Wild Mar 11 '13 at 11:27
  • If you have 10 variables to *return*, you should be using some sort of state object. – Alex Chamberlain Mar 11 '13 at 11:28
  • As I said, it's not the way one should approach this problem in modern code. It's mainly historic ballast and it is part of the local coding style. If you don't like it, don't use OpenFOAM. BTW: OpenFOAM has IMHO far larger issues (\*cough\* static (de-)initialization order fiasco \*cough\*)... – Michael Wild Mar 11 '13 at 11:30
  • Indeed, this is OpenFOAM. I don't really like this #include of code, but I'll accept and work with it. Not much choice, really. – Madeleine P. Vincent Mar 11 '13 at 15:20
  • @MadeleineP.Vincent No, not really ;-) However, when you're writing your own code, you're free to not use this scheme and directly write the corresponding code. – Michael Wild Mar 11 '13 at 15:22
  • Why don't they use macros? – Aniket Navlur Nov 28 '18 at 10:33
  • They do that in a lot of places, e.g. for the repetitive code defining template variations. But here you would have even less indication as to where the "pasted" code comes from, as you'd have to guess which include file defines the macro in question. – Michael Wild Nov 29 '18 at 19:22
0

I would hesitate to call it "crazy", as there might be reasons for this type of #include usage. The problem is to understand these reasons and to judge whether this is the best approach in the given context. One reason I could imagine is that the code is generated in some way instead of being hand-written.

OTOH it does seem strange and it does have a certain code smell. I would also be curious to find out what exactly the reasons are and, if it turns out there are none or the wrong ones, it's a good candidate for refactoring.

Daniel Frey
0

I think that this is very messy and you would be better off writing a function.

Here is a question that I found on SO asking "Why is it valid to include a header file twice in C++?" You might find the answers interesting. I would never write my code this way, as I think that fixing bugs and any other problems would be painful and time-consuming.

LegendaryLaggy
0

I would not call that kind of thing "crazy" - I would use the terms "unusual", "hard to understand", "unexpected" and therefore "hard to read, debug and maintain". Or just "WTF" - decreasing code quality.

Never ever do that. Use functions. Or, if it really, really must be, use a macro that does exactly the same but that people are far more familiar with. Yes, macros are bad, and can become a PITA when it comes to debugging. But this is worse.

Edit: to clarify: I don't like macros. I avoid them where possible. Mostly. Use functions, templates, anything. But when it comes down to "macro or WTF-#include", use a macro.

Arne Mertz
  • In this case, macros are worse. Macros are good for single-statement boilerplate, however longer pieces of code they are troublesome as they effectively disable your debugger. – Michael Wild Mar 11 '13 at 11:16
  • 1
    Yes, I know that. I wrote that. Debugging macros is no fun. So this use of `#include` may be easier to debug than macros, but that's as good as it gets. It's far more obfuscating than macros. It is worse than macros *in every aspect other than debugging*. – Arne Mertz Mar 11 '13 at 11:32
0

If you're working in a team, there's a need to establish procedures and standards to ensure clean communication. Once that's solved, it doesn't matter if you speak Portuguese or Mandarin, as long as everyone is comfortable with that.

It's a mindset thing. Some people - including teachers I work with - are enslaved by rules; however, I can't take them seriously, especially when one of them, in order to teach OO, uses 'lettuce' and 'fruits' as examples. I suspect there's a tinge of Kohlberg's stages slowing down the thinking of some people.

I use this technique - include source.cpp - in personal projects. I've done it in C and also Lazarus (Pascal), and it serves my purposes better than fiddling around with linker parameters and makefiles. My code has comments, variable declarations are aligned, assignment operators are also aligned; when a student gives me code, I go through all of it inserting spaces and lines until it looks the way I like before analysing it. I'm obsessed with clean source code organization. You see? There's method in madness!

Tools (#include, #define, lambda...) are just that: tools, but of course, unexpected use of a tool can be disorienting; that's bad. More importantly, when the "psychological complexity of software" (Google it) hurts (you or someone in your team), stop immediately and ponder what's going wrong.

That's my 2cts.

A Koscianski