28

This question may seem rather basic, but coming from an engineering (non-computer-science) background, I was unsure what the snippets of code beginning with '#' were in some C++ code.

A quick search led me to the concise, well-explained tutorial page on preprocessor directives at cplusplus.com.

But why bother with the concept of preprocessor directives at all? Is it not possible to write equivalent code that can assign values to constants, define subroutines/functions/macros, and handle errors?

I guess I ultimately want to know when it is good practice to use such preprocessor directives, and when it is not.

Brian Tompsett - 汤莱恩
Zaid

14 Answers

31

You use preprocessor directives when you need to do something outside the scope of the actual application. For instance, you'll see preprocessor directives used to include or exclude code based on the platform the executable is being built for:

#ifdef _WIN32 // _WIN32 is defined by compilers targeting Windows (32- and 64-bit)
#include <windows.h>
#else
#include <unistd.h>
#endif

Preprocessor directives are also used to guard includes so that classes/functions etc. are not defined more than once.

#ifndef MY_CLASS_6C1147A1_BE88_46d8_A713_64973C9A2DB7_H_
#define MY_CLASS_6C1147A1_BE88_46d8_A713_64973C9A2DB7_H_
    class MyClass {
        // ... class members ...
    };
#endif

Another use is for embedding versioning inside of code and libraries.

In the Makefile you have something along the lines of (the quotes are escaped so that the macro expands to a string literal, which is what cout needs):

-D MY_APP_VERSION=\"4.5.1\"

While in the code you have

cout << "My application name version: " << MY_APP_VERSION << endl;
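Alternatively, the version can be passed unquoted (-D MY_APP_VERSION=4.5.1) and converted to a string in the code with the classic two-level stringification trick. A minimal sketch; XSTR/STR are conventional but hypothetical names:

#include <iostream>

// Two levels of expansion so the macro argument is expanded *before*
// it is stringized; XSTR/STR are conventional but hypothetical names.
#define STR(s) #s
#define XSTR(s) STR(s)

// Fallback so the sketch compiles even without -D MY_APP_VERSION=4.5.1
#ifndef MY_APP_VERSION
#define MY_APP_VERSION 0.0.0
#endif

int main() {
    std::cout << "My application version: " << XSTR(MY_APP_VERSION) << std::endl;
    return 0;
}

The extra level (XSTR) matters: stringizing MY_APP_VERSION directly would yield the literal text "MY_APP_VERSION" rather than its value.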
RC.
    This is especially important in cases where there are functions that exist on one platform but not another -- like compiler intrinsics, which wrap SSE opcodes on the x86 and VMX opcodes on PPC, etc. In this case there is no viable alternative to using #ifdef, because templates will try to compile code using the nonexistent functions, even if only to later throw them away. – Crashworks Nov 24 '09 at 01:54
  • One nitpick: WIN32 is defined because that's the default project settings. The compiler itself defines _WIN32 (I believe MinGW or other windows-based compilers define this), and optionally _WIN64, to describe the platform. It also defines _MSC_VER as the compiler version itself. – Tom Nov 24 '09 at 02:54
  • Hah.. sorry... another nitpick. `__MY_HEADER_H__` is bad, since `__[A-Z]` is technically reserved for the implementation. Most compilers define *many* configuration settings, so it's not unlikely to eventually run into a weird macro-related error from this. – Tom Nov 24 '09 at 02:57
    `/^_[A-Z]/` is also reserved, as is `/__/`. So you can't have _MY_CLASS_H either. And before you ask, `/^_[a-z]/` is reserved in the global namespace, so that's also not suitable for use as a macro. This is why `#pragma once` is so attractive - never mind compile speed, it saves you having to pick a name for the include guard. – Steve Jessop Nov 24 '09 at 13:12
  • there's no fundamental reason for `#pragma once` to be any faster than traditional include-guards. I just tested this by running `strace` on `gcc`, and properly-guarded headers are never opened more than once. I doubt `cl.exe` has this optimization, since Microsoft has a vested interest in non-portable options and vendor lock-in. – Tom Dec 12 '09 at 02:07
10

Answer 1: conditional code that has to vary depending on what sort of computer it runs on.

Answer 2: enabling and disabling language extensions and compatibility features seen at compile time.

The preprocessor came from C, where there were many things you could not express any other way. Good C++ code finds fewer reasons to use it than C code did, but sadly it's not quite useless.

bmargulies
    Also, since C++ has "inline" you should not use #define macros in C++ to try for speed. See also http://google-styleguide.googlecode.com/svn/trunk/cppguide.xml#Preprocessor_Macros – Harold L Nov 23 '09 at 20:15
  • @Harold: Where is inline mentioned in this answer? – Martin York Nov 23 '09 at 23:37
8

Preprocessing occurs before the code is compiled. It's appropriate in cases like the following

#ifdef _WIN32
#include <something.h>
#elif defined(__linux__)
#include <somethingelse.h>
#endif

Obviously, including header files is something that has to happen at compile time, not runtime; we can't do this with variables.

On the other hand, in C++ it is good practice, and greatly encouraged, to replace constant macros like

#define PI 3.141592654

with

const double PI = 3.141592654;

The reason is that a const variable has a real type, so the compiler can check how it is used, and the name obeys normal scoping rules.

Also

#define MAX(x,y) (x) > (y) ? (x) : (y)

Is not very nice because you can write

int i = 5;
int max = MAX(i++, 6);

The preprocessor would replace that with:

int max = (i++) > (6) ? (i++) : (6);

Which is clearly not going to give the intended result.

Instead, MAX should be a function (not a macro). As a function its parameters have real types, and each argument is evaluated exactly once.
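A rough sketch of the function alternative (the standard library's std::max already provides this, so my_max is purely illustrative):

#include <iostream>

// A function-template version of MAX: parameters are type-checked and
// each argument is evaluated exactly once.
template <typename T>
T my_max(T a, T b) {
    return a > b ? a : b;
}

int main() {
    int i = 5;
    int max = my_max(i++, 6);                        // i is incremented once; max == 6
    std::cout << max << ", i = " << i << std::endl;  // prints "6, i = 6"
    return 0;
}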

I have seen the preprocessor used for all sorts of interesting things, like declaring new language "keywords"; that can aid readability in some cases.

In short, use the preprocessor for things that must happen at compile time, such as conditional include directives. Avoid using it for constants. Avoid macros, and use functions instead where possible.

hookenz
  • Unlikely that I would ever have to go for cross-platform compatibility, but I get the picture. – Zaid Nov 23 '09 at 20:31
    Note that max() is already in the C++ standard library, all nicely templated and stuff. – ceo Nov 23 '09 at 21:59
7

Because preprocessor directives get executed at build time, while code you write will get executed at run time. So preprocessor directives effectively give you the ability to modify your source code programmatically.

Note that the C preprocessor is a rather crude mechanism for this kind of thing; C++'s template system provides a much more powerful framework for compile-time construction of code. Other languages have even more powerful metaprogramming features (for example, Lisp's macro system).
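For instance, a minimal sketch of the kind of compile-time code construction templates allow; the factorial is computed entirely by the compiler:

#include <iostream>

// The compiler recursively instantiates Factorial<N> and folds the
// result into a constant; nothing is computed at run time.
template <unsigned N>
struct Factorial {
    static const unsigned value = N * Factorial<N - 1>::value;
};

template <>
struct Factorial<0> {
    static const unsigned value = 1;
};

int main() {
    std::cout << Factorial<5>::value << std::endl; // prints 120
    return 0;
}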

Daniel Pryden
    To nitpick a bit: preprocessor directives get executed before compile time. – Nemanja Trifunovic Nov 23 '09 at 20:09
  • @Nemanja: True. Perhaps I should have said "executable generation time". The difference is mainly an implementation detail of the compiler toolchain, however. – Daniel Pryden Nov 23 '09 at 20:10
    Well, no, it does need to be *pre* processed before the compiler looks at the code. – Ed S. Nov 23 '09 at 20:11
  • OK, I've changed my answer. I was using "compile time" in the general sense, ignoring the details of the C toolchain. – Daniel Pryden Nov 23 '09 at 20:15
  • Build time vs runtime throws up an interesting point. Can appreciable speed gains be realized through using preprocessor directives? – Zaid Nov 23 '09 at 20:25
  • @Zaid: Most use of preprocessor directives is oriented toward reduction of executable size (e.g. by removing debugging code) rather than speed, but there are definite speed tricks you can do with macros. Many of them are better implemented with other C++ features, however. – Daniel Pryden Nov 23 '09 at 20:30
  • Yes, for instance, if your application was calling a method to set some value inside of a tight loop the macro will speed it up because you will avoid the overhead of calling the function. – Ed S. Nov 23 '09 at 20:31
  • Daniel & Ed: Thanks for the insight. – Zaid Nov 23 '09 at 20:34
  • @Ed: which would be done better in most cases by using `inline` functions instead of macros. – Georg Fritzsche Nov 23 '09 at 20:45
  • @gf: No, "inline" is a *suggestion* to the compiler that may be ignored at will. These days practical use of the "inline" directive is becoming rare, the compiler usually does a better job of determining if/when a function should be inlined. – Ed S. Nov 23 '09 at 21:45
    @Ed: if it's ever true that the compiler does a better job of deciding whether to inline code than the programmer, then it's not also true in general that using a macro inside a tight loop will speed things up compared with an equivalent function. It might speed things up, or it might slow them down if the compiler correctly determined that inlining in that case was a pessimization. It might make no difference at all, if the compiler would have inlined the call anyway. And the answer is probably different on different compilers and platforms. – Steve Jessop Nov 24 '09 at 13:05
6

It is used most frequently for two things, which would be harder to organize without it:

  1. Include guards.
  2. Different sections of code for different platforms.
epochwolf
alexkr
3

Many programming languages have meta-programming facilities, where you write code for the compiler to follow, rather than the runtime environment.

For example, in C++, we have templates which allow us to instruct the compiler to generate certain code based on a type, or even a compile-time constant. Lisp is perhaps the most famous example of a language with advanced meta-programming facilities.

C preprocessor directives/macros are just another form of "meta-programming", albeit a cruder form than is available in other languages. Preprocessor directives instruct the compiler to do certain things at compile time, such as ignoring certain code on certain platforms, or finding and replacing one string in the code with another. This would be impossible to do at runtime, after your code is already compiled.
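A tiny sketch of that find-and-replace behaviour; the compiler never sees the macro name, only the substituted text:

// The preprocessor rewrites the source textually before compilation.
#define SQUARE(x) ((x) * (x))

int main() {
    int n = SQUARE(3); // the compiler only ever sees ((3) * (3))
    return n == 9 ? 0 : 1;
}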

So essentially, the C-preprocessor is an early form of "meta-programming", or compiler-programming.

Charles Salvia
  • Hmm... never heard of meta-programming before. I like the way you generalized the usage of preprocessor directives though. – Zaid Nov 23 '09 at 20:51
2

Generally, preprocessor directives should not be used. Sadly, sometimes you have to in C and C++.

C was originally defined in such a way that you really couldn't do anything serious with it without using the preprocessor. The language had no other built-in support for creating modular programs, constants, inlined code, or generic programming.

C++ gets rid of most of these issues, but the facility is still there, so it still gets used. (Interestingly, not the modularization one: we're still stuck with #include.)

If you want to compare with a language built at a similar-level of abstraction for similar tasks that does not have a preprocessor, go take a look at Ada.

T.E.D.
  • I wonder what is so bad about the include directive. – user13947194 Jun 19 '22 at 09:47
• @user13947194 - Compared to a proper modularization facility, #include is incredibly problematic. One of its worst problems is that people think it's a modularization facility when it's really a rote code-copying facility, which can both lead to some really hard-to-track-down bugs and to some seriously astonishing uses (e.g.: there are some interesting things you can do by #including the same file multiple times into different places in the same source file) – T.E.D. Jun 19 '22 at 14:18
• What would you suggest as an alternative? Keeping in mind that Java's import does not do the same (or a similar) thing as include. In Java, you include files via the javac command line. In C/C++ you use the command line to point to a directory, and use includes to point to the relative file paths. – user13947194 Jun 20 '22 at 17:52
  • @user13947194 - Proper languages are *supposed* to have a full-blown language feature for this kind of thing. It looks like C++20 has a ["module" import/export](https://en.cppreference.com/w/cpp/language/modules) statement set for this, but I've never had a chance to use a compiler that supports it, so I can't comment on its utility. – T.E.D. Jun 20 '22 at 21:06
• I guess you don't have the time. Because you're just saying they are supposed to... Why are they supposed to? Is C/C++ not a proper language? At the end of the day it doesn't really matter. It was an opinionated question, and can only be answered with opinion. It is as opinionated as asking if yellow is better than blue. I have no love or hate for the include directive. You guys would rather do away with it; I can't fathom why is my only problem. But that is just a minor problem. – user13947194 Jun 21 '22 at 05:53
2

A bit of history here: C++ was developed from C, which needed the preprocessor a lot more than C++ does. For example, to define a constant in C++ you'd write something like const int foo = 4; instead of #define FOO 4, the rough C equivalent. Unfortunately, too many people brought their preprocessor habits from C to C++.

There are several reasonable uses of the preprocessor in C++. Using #include for header files is pretty much necessary. It's also useful for conditional compilation, including header include guards, so it's possible to #include a header several times (such as in different headers) and have it processed only once. The assert statement is actually a preprocessor macro, and there are a few similar uses.
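For instance, because assert is a macro, the check can be compiled away entirely; a minimal sketch:

#include <cassert>

int main() {
    int x = 2 + 2;
    // With -DNDEBUG the preprocessor expands this whole line to nothing,
    // so release builds pay no cost for the check.
    assert(x == 4);
    return 0;
}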

Aside from those, there are darn few legitimate uses in C++.

David Thornley
  • `#include`, and the associated `#ifdef...#define` are only required because neither C nor C++ has seen fit to create a proper interface facility, as can be found in Pascal, Modula-2, Ada, and pretty much every modern language. This is just a preprocessor hack that everyone has gotten used to. – T.E.D. Nov 23 '09 at 22:23
1

No, it actually is not possible to get by without the preprocessor in all cases. One of my favorite macros is

// DEBUG is expected to come from the build, e.g. -DDEBUG=1 or -DDEBUG=0
#define IFDEBUG if(DEBUG==1)

// usage:
IFDEBUG {
  printf("Dump of stuff...");
} else {
  // ...do release stuff
}

Without macros, I would have (possibly a lot of) wasted space in the final executable.

And you must also realize that C/C++ doesn't have any sort of package/require system like other languages, so without the preprocessor there would be no way of preventing code duplication (header files couldn't be included).

Earlz
    This will leave an "if" test in the final executable: you want #ifdef guards around the DEBUG-only code and then to compile with or without -DDEBUG to control whether the printf is in the program code. – Harold L Nov 23 '09 at 20:11
• You're correct, I gave a bad example. But I would pray I was using a competent enough compiler to optimize away `if(0){..}` – Earlz Nov 23 '09 at 20:18
  • Bad example, IMO; why not simply '#if defined DEBUG'? – Clifford Nov 23 '09 at 20:18
    I've learned to use enums for this expression, it's actually nice because you can localize the declaration to a file/scope (I'm not the type to #define #define #undef #define...) – justin Nov 23 '09 at 20:18
  • Example: /* This is defined at the root namespace, and may be localized in scope when necessary */ enum { LogEnabled = 0 }; /* usage: */ if (LogEnabled) { printf("Dump of stuff:.,,,"); } /* ..do release stuff */ – justin Nov 23 '09 at 20:24
  • Actually, I *hate* this style. When I'm reading someone else's code to track down a problem, I'm never really sure if I should be seeing those debug messages or not. – T.E.D. Nov 23 '09 at 20:32
  • I prefer having `if (DEBUG == 1) {...}` to `#if DEBUG==1` because all compilers I've dealt with in the last 10+ years will optimize the `if` test away when the condition was constant - but the compiler still has to at least syntax check the code. I hate it when I try to enable the debug stuff and that code has gone stale and doesn't compile anymore. This happens far more often than I'd like. – Michael Burr Nov 23 '09 at 21:05
1

But why bother with the concept of preprocessor directives at all? Is it not possible to write equivalent code that can assign values to constants, define subroutines/function/macros and handle errors?

Its use is much lower in C++; language features were created specifically to avoid the problems associated with the preprocessor.

I guess I ultimately want to know when it is good practice to use such preprocessor directives, and when it is not.

In general C++ source, it is often seen as poor practice, particularly when there is a means to do the same thing with another language feature. It is required for some things (i.e. platform- or build-dependent programs, and generative programs). In short, there is usually a replacement which scales well, such as a constant defined as an enum, or an inline function template instead of a macro (see the sketch below). If you find yourself using one and you are not sure, then just ask or search whether there is a better way to declare this_code_snippet in C++ without the preprocessor.
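A minimal sketch of those two replacements (BufferSize and square are made-up names for illustration):

// 1. A scoped, typed constant instead of: #define BUFFER_SIZE 256
enum { BufferSize = 256 }; // or: const int BufferSize = 256;

// 2. An inline function template instead of a function-like macro
template <typename T>
inline T square(T x) { return x * x; }

int main() {
    char buffer[BufferSize]; // BufferSize is a real compile-time constant
    buffer[0] = 0;
    return square(3) == 9 ? 0 : 1;
}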

justin
0

It's a better-than-nothing substitute to get some reflection capabilities out of C++.

It is very useful for generating variables and strings that share the same name.
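A sketch of the idea using the preprocessor's stringize (#) and token-pasting (##) operators; DECLARE_NAMED is a hypothetical helper:

#include <iostream>

// Declares a variable plus a string holding its name, so the two can
// never get out of sync.
#define DECLARE_NAMED(type, name) \
    type name;                    \
    const char* name##_label = #name

int main() {
    DECLARE_NAMED(int, counter); // int counter; const char* counter_label = "counter";
    counter = 42;
    std::cout << counter_label << " = " << counter << std::endl;
    return 0;
}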

Macke
0

Answered here.

Dima
0

The C preprocessor performs a number of tasks, some but not all of which have better alternatives in C++. Where C++ has a better alternative, use it. Those alternatives include templates, inlining, and const variables (an oxymoron, but that is what the standard calls them) in place of #define macros.

However, there are a few things that you would not want to, or simply cannot, do without: #include, for example, is essential, and when coding for multiple platforms or configurations, conditional compilation remains useful (although it should be used sparingly in all cases).

Compiler-specific extensions controlled via #pragma may be unavoidable in some cases.
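For example, MSVC's #pragma warning (a real MSVC directive; other compilers have their own equivalents and typically ignore pragmas they don't recognize):

#include <cstdio>
#include <cstring>

int main() {
    char dst[16];
// MSVC-specific: silence the "unsafe function" warning for this region only.
#pragma warning(push)
#pragma warning(disable: 4996) // 'strcpy': This function or variable may be unsafe
    strcpy(dst, "hello");
#pragma warning(pop)
    printf("%s\n", dst);
    return 0;
}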

Clifford
  • Yeah... I can't imagine how `#include` and `#pragma` would be written in any other way – Zaid Nov 23 '09 at 20:48
  • Well you can avoid #include by copying all the content into one file! ;) That is why it is a good thing not a bad thing. #pragma remains bad even when you have to resort to it - for reasons of portability. – Clifford Nov 23 '09 at 21:20
    ...or by creating a proper language construct for handling interfaces to other compilation units, rather than relying on this 40 year old preprocessor hack. (Let's not forget the extra `#ifdef` everybody has to put around everything in their included file to get the proper desired behavior of a real interface). – T.E.D. Nov 23 '09 at 22:18
-1

Unlike everybody else, I don't have a big problem with preprocessor directives. The only thing is that preprocessor defines work better in C than in C++. E.g. Win32, OpenGL, zip libraries, JNI, and many other C libraries use preprocessor directives. Win32, for instance, defines OPAQUE and TRANSPARENT, which are passed to its function SetBkMode(HDC, int). Now imagine how easy it is to want to use one of those words yourself. You can't, because the C preprocessor doesn't care about namespaces. Why isn't there a C++-aware preprocessor?
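A self-contained sketch of that collision (the #define stands in for what the real Win32 header <wingdi.h> does):

// A stand-in for what the Win32 headers do:
#define TRANSPARENT 1

namespace gfx {
    // const int TRANSPARENT = 42; // won't compile: after preprocessing
    //                             // the compiler sees "const int 1 = 42;"
    const int Transparent = 42;    // workaround: pick a different spelling
}

int main() {
    return gfx::Transparent == 42 ? 0 : 1;
}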

But I know the right tool for the job.

user13947194