#define by itself is not bad, but it does have some unpleasant properties. I'll list a few that I know of:
"Functions" do not act as expected.
The following code seems reasonable:
#define getmax(a,b) (a > b ? a : b)
...but what happens if I call it like this?
int a = 5;
int b = 2;
int c = getmax(++a,b); // c equals 7.
No, that is not a typo. c will be equal to 7. If you don't believe me, try it. That alone should be enough to scare you.
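To see why, expand the macro by hand: getmax(++a, b) becomes (++a > b ? ++a : b), so ++a runs twice on the taken branch. Here is a minimal single-file sketch (the getmax_fn function is my illustrative alternative, not part of the original) contrasting the macro with a plain function that evaluates each argument exactly once:

#include <stdio.h>

#define getmax(a,b) (a > b ? a : b)

/* A plain function: each argument is evaluated exactly once. */
static int getmax_fn(int a, int b) { return a > b ? a : b; }

int main(void) {
    int a = 5, b = 2;

    /* The macro expands textually to (++a > b ? ++a : b): ++a runs twice. */
    int c = getmax(++a, b);
    printf("macro:    c = %d, a = %d\n", c, a);   /* c = 7, a = 7 */

    a = 5;
    /* The function increments a exactly once. */
    int d = getmax_fn(++a, b);
    printf("function: d = %d, a = %d\n", d, a);   /* d = 6, a = 6 */
    return 0;
}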
The preprocessor is inherently global
Whenever you use a #define to define a function-like macro (such as stop()), it applies to ALL files included after the definition is seen.
This means you can actually change the behavior of libraries you did not write: as long as a library header uses the name stop(), your macro silently rewrites its code even though you never modified it.
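Here is a rough single-file sketch of that effect (stop, library_park, and the printed message are hypothetical stand-ins, not any real library's API); the call inside library_park gets rewritten by a macro defined elsewhere, so the "library" behaves differently without its source ever being edited:

#include <stdio.h>

/* Imagine this lives in one of YOUR headers, included early: */
#define stop() printf("emergency shutdown!\n")

/* ...and this function lives in a header you did NOT write, included
   afterwards. The preprocessor has already rewritten its call to stop(). */
static void library_park(void) {
    stop();               /* expands to printf("emergency shutdown!\n") */
}

int main(void) {
    library_park();       /* prints "emergency shutdown!" */
    return 0;
}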
Debugging is more difficult.
The preprocessor does symbolic replacement before the code ever makes it to the compiler. Thus if you have the following code:
#define NUM_CUSTOMERS 10
#define PRICE_PER_CUSTOMER 1.10
...
double something = NUM_CUSTOMERS * PRICE_PER_CUSTOMER;
If there is an error on that line, you will NOT see the convenient names in the error message; instead you will see something like this:
double something = 10 * 1.10;
That makes it harder to trace values back to where they were defined. In this small example it doesn't seem that bad, but if you really get into the habit of doing it, you can run into some real headaches.
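One common alternative (my suggestion, not part of the original point) is to use typed constants instead of object-like macros; their names survive past the preprocessor, so the compiler and the debugger can refer to them directly:

#include <stdio.h>

/* Typed constants keep their names after preprocessing. */
static const int    NUM_CUSTOMERS      = 10;
static const double PRICE_PER_CUSTOMER = 1.10;

int main(void) {
    double something = NUM_CUSTOMERS * PRICE_PER_CUSTOMER;
    printf("%f\n", something);   /* 11.000000 */
    return 0;
}

If you do need to see what a macro-heavy file looks like after expansion, gcc -E (or cc -E) prints the preprocessed source, which can help when hunting down this kind of problem.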