13

I have been developing C++ for less than a year, but in that time, I have heard multiple people talk about how horrible #define is. Now, I realize that it is interpreted by the preprocessor instead of the compiler, and thus, cannot be debugged, but is this really that bad?

Here is an example (untested code, but you get the general idea):

#define VERSION "1.2"

#include <string>

class Foo {
  public:
    std::string getVersion() {return "The current version is "+VERSION;}
};
  1. Why is this code bad?
  2. Is there an alternative to using #define?
Brian Tompsett - 汤莱恩
Joel
    The inverse of this question: [Why use #define instead of a variable](http://stackoverflow.com/questions/6004963/why-use-define-instead-of-a-variable) – Matt Ball Apr 21 '12 at 18:38
  • `define`s are tricky constructs and can be easily mis-used or abused. but just like `goto` they can be very useful. – Anycorn Apr 21 '12 at 18:39
    possible duplicate of [static const vs #define](http://stackoverflow.com/questions/1637332/static-const-vs-define), in particular this [answer](http://stackoverflow.com/a/3835772/61574) should be enough. – Anonymous Apr 21 '12 at 18:43
    I doubt that this code compiles. Without the `+` it would, but as is ? Nope. – Matthieu M. Apr 21 '12 at 19:08

5 Answers

15

Why is this code bad?

Because VERSION can be overwritten and the compiler won't tell you.

Is there an alternative to using #define?

const char * VERSION = "1.2";

or

const std::string VERSION = "1.2";
Benjamin Lindley
    Actually, the compiler likely *will* tell you when you redefine a macro. Even without any warning options, gcc will warn you with "warning: "FOO" redefined... note: this is the location of the previous definition". And it will do that only if the values are different, so you don't get useless warnings. – Ambroz Bizjak Apr 21 '12 at 18:50
  • Does a const have the same scope as a `#define`? I guess a `#define` doesn't really have a scope - it can go anywhere. Can a const be used anywhere as well? – Joel Apr 21 '12 at 18:53
    @Joel: One of the biggest advantages of `const x` over `#define` is that it is scoped. – Puppy Apr 21 '12 at 18:54
  • @Joel: No, it is scoped. That's usually seen as an advantage. Do you have a case in mind where it's not? – Benjamin Lindley Apr 21 '12 at 18:55
  • In my example, if I replaced my define with a const, would the const be in the global scope? Does this mean that I could use the const anywhere in the file? – Joel Apr 21 '12 at 18:57
  • The negative effects of #define in fact depend a lot on whether it's in a header file, or in a source file that is compiled separately. In a source file, you can afford to use much shorter identifiers, including macro names, without risk of conflict, because you see all the code. This also won't collide with the headers you include assuming you always use more specific identifiers in headers (e.g. prefixing them). – Ambroz Bizjak Apr 21 '12 at 19:02
  • @AmbrozBizjak: the problem is on the other direction, the compiler (the preprocessor actually) will replace VERSION with the string *everywhere*, regardless of context. – David Rodríguez - dribeas Apr 21 '12 at 19:10
11

The real problem is that defines are handled by a different tool from the rest of the language (the preprocessor). As a consequence, the compiler doesn’t know about it, and cannot help you when something goes wrong – such as reuse of a preprocessor name.

Consider the case of max which is sometimes implemented as a macro. As a consequence, you cannot use the identifier max anywhere in your code. Anywhere. But the compiler won’t tell you. Instead, your code will go horribly wrong and you have no idea why.
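A minimal sketch of how that plays out (the macro here is hypothetical, though <windows.h> famously ships one just like it):

#include <algorithm>

// Imagine some header you include does this:
#define max(a, b) ((a) > (b) ? (a) : (b))

int larger(int a, int b) {
    // You meant std::max, but the preprocessor rewrites the tokens max(a, b)
    // before the compiler ever sees them, so this line becomes
    // std::((a) > (b) ? (a) : (b)) and fails with a cryptic error that points
    // nowhere near the real cause.
    return std::max(a, b);
}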

Now, with some care this problem can be minimised (if not completely eliminated). But for most uses of #define there are better alternatives anyway so the cost/benefit calculation becomes skewed: slight disadvantage for no benefit whatsoever. Why use a defective feature when it offers no advantage?

So here is a very simple diagram:

  1. Need a constant? Use a constant (not a define)
  2. Need a function? Use a function (not a define; see the sketch after this list for both)
  3. Need something that cannot be modelled using a constant or a function? Use a define, but do it properly.
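A rough sketch of the first two cases (the names are made up for illustration):

// 1. Instead of:  #define BUFFER_SIZE 512
const int BUFFER_SIZE = 512;                  // typed, scoped, visible to the debugger

// 2. Instead of:  #define SQUARE(x) ((x) * (x))
inline int square(int x) { return x * x; }    // argument is evaluated exactly once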

Doing it “properly” is an art in itself but there are a few easy guidelines:

  1. Use a unique name. All capitals, always prefixed by a unique library identifier. max? Out. VERSION? Out. Instead, use MY_COOL_LIBRARY_MAX and MY_COOL_LIBRARY_VERSION. For instance, Boost libraries, big users of macros, always use macros starting with BOOST_<LIBRARY_NAME>_.

  2. Beware of evaluation. In effect, a parameter in a macro is just text that is replaced. As a consequence, #define MY_LIB_MULTIPLY(x) x * x is broken: it could be used as MY_LIB_MULTIPLY(2 + 5), resulting in 2 + 5 * 2 + 5. Not what we wanted. To guard against this, always parenthesise all uses of the arguments (unless you know exactly what you’re doing – spoiler: you probably don’t; even experts get this wrong alarmingly often).

    The correct version of this macro would be:

     #define MY_LIB_MULTIPLY(x) ((x) * (x))
    

But there are still plenty of ways of getting macros horribly wrong, and, to reiterate, the compiler won’t help you here.
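For instance (an illustrative fragment), even the fully parenthesised version above still evaluates its argument twice:

#define MY_LIB_MULTIPLY(x) ((x) * (x))

int i = 2;
int n = MY_LIB_MULTIPLY(++i);  // expands to ((++i) * (++i)): i is incremented twice,
                               // and the two increments are unsequenced (undefined behaviour)

// An inline function has none of these problems:
inline int my_lib_multiply(int x) { return x * x; }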

Konrad Rudolph
4

#define isn't inherently bad, it's just easy to abuse. For something like a version string it works fine, although a const char* would be better, but many programmers use it for much more than that. Using #define in place of a typedef, for example, is silly when, in most cases, a typedef would be better. So there's nothing wrong with #define directives as such, and some things can't be done without them. They have to be evaluated on a case-by-case basis. If you can figure out a way to solve a problem without using the preprocessor, you should do it.
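For instance, a small sketch of the typedef point (identifiers are made up):

#define BYTE unsigned char         // blind textual substitution: no scope, ignores namespaces

typedef unsigned char Byte;        // a real type alias the compiler understands
using Callback = void (*)(int);    // the C++11 alias syntax, also usable with templates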

Kyle
2

I would not use #define to define a constant. Use the static keyword, or better yet:

const int kMajorVer = 1;
const int kMinorVer = 2;

or

const std::string kVersion = "1.2";

Herb Sutter has an excellent article detailing why #define is bad, and it also lists some examples where there is really no other way to achieve the same thing: http://www.gotw.ca/gotw/032.htm.

Basically, as with many things, it's fine so long as you use it correctly, but it is easy to abuse, and macro errors are particularly cryptic and a bugger to debug.

I personally use them for conditional debug code and also for variant data representations, which is detailed at the end of the Sutter article.
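For example, a minimal sketch of the conditional-debug-code use (the DEBUG_BUILD flag and DBG_LOG name are made up):

#include <iostream>

#ifdef DEBUG_BUILD
#define DBG_LOG(msg) (std::cerr << "[debug] " << msg << std::endl)
#else
#define DBG_LOG(msg) ((void)0)     // compiles away to nothing in release builds
#endif

// DBG_LOG("frames rendered: " << frameCount);  // costs nothing unless DEBUG_BUILD is defined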

EdChum
1

In general, the preprocessor is bad because it creates a two-pass compilation process that is unsafe, produces difficult-to-decode error messages, and can lead to hard-to-read code. You should avoid it where possible:

const char* VERSION = "1.2";

However, there are cases where it is impossible to do what you want without the preprocessor:

#define Log(x) cout << #x << " = " << (x) << endl;
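A rough usage sketch: the #x stringises the argument, something an ordinary function cannot do, so the macro can print both the expression and its value:

int answer = 6 * 7;
Log(answer + 1);
// expands to: cout << "answer + 1" << " = " << (answer + 1) << endl;
// prints:     answer + 1 = 43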
Andrew Tomazos
  • Compilation in C++ actually takes way more than two passes. – Konrad Rudolph Apr 21 '12 at 18:46
    I do not know what you mean by “logical passes”. Or rather, I don’t see how that would be inherently unsafe. – Konrad Rudolph Apr 21 '12 at 18:59
    Consider (A) the original code, (B) the preprocessed intermediate form of the code; and (C) the annotated parse tree. The two "logical passes" I am referring to are preprocessing `A->B` and compilation `B->C`. – Andrew Tomazos Apr 21 '12 at 19:21
  • Yes, that’s what I thought. Why is that inherently unsafe? Are you referring to the fact that `C` cannot contain annotations from `A` since these are lost in the transition `A->B`? This isn’t an inherent problem, just one with current implementations. Clang in fact handles this just fine (as in, the annotated parse tree contains all information from the unpreprocessed source), and as far as I know there are efforts underway to make GCC also handle this. – Konrad Rudolph Apr 21 '12 at 19:24
    It is deeper than that - by specification they are logically separated - the preprocessor resolves all its potentially (turing complete) complex changes on the source, and then the result is compiled (again). No matter how many compiler/ide features you throw at that, it is never going to be as simple to see what is going on as for example preprocessor-free Java. – Andrew Tomazos Apr 21 '12 at 19:40