3

For some kinds of programs I need a constant high value to indicate a property of a variable. For example, I set color[i] = 1000000; if the i-th node in a tree is unexplored. But I quite often miswrite the number of 0s at the end, so I wondered whether it's better to do it this way:

#define UNEXPLORED 1000000
color[i] = UNEXPLORED;

I remember reading somewhere that it's much better to avoid using #define. Is that right? How would you tackle this problem?

asked by gen, edited by Brian Tompsett - 汤莱恩
  • Go through this thread; the best answer from there: `#define`s don't respect scope - http://stackoverflow.com/questions/1944041/advantage-and-disadvantages-of-defines-vs-constants – Mudassir Hasan Mar 05 '13 at 08:14
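A minimal sketch of the scope point from that linked thread (the names here are illustrative, not from the question):

void explore()
{
    #define UNEXPLORED_MACRO 1000000     // visible for the rest of the file
    const int unexploredConst = 1000000; // local to explore()
}

int leaked = UNEXPLORED_MACRO;           // still expands here
// int error = unexploredConst;          // error: not in scope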

3 Answers

5

For simple constants, you can use either const or the new constexpr:

constexpr unsigned int UNEXPLORED = 1000000;

In a case like this, there is no difference between using const and constexpr. However, "variables" marked constexpr are evaluated at compile time rather than at run time, and may be used in places that otherwise only accept compile-time constants.
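For instance, such a constant can appear where the language requires a compile-time value, such as an array bound or a case label (a minimal sketch):

constexpr unsigned int UNEXPLORED = 1000000;

// Both uses below require a compile-time constant; naming the value
// once avoids miscounting the zeros each time it is written out.
int counts[UNEXPLORED % 100 + 1];        // array bound

const char* describe(unsigned int color)
{
    switch (color)
    {
        case UNEXPLORED: return "unexplored";
        default:         return "explored";
    }
}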

Some programmer dude
  • Is there any particular reason for using `auto` here? The main use I see of `auto` is to avoid repeating type names, or spelling out complex, nested types. This is neither, and spelling out the type explicitly here improves clarity. (Similarly, why `constexpr`, rather than the traditional `const`?) – James Kanze Mar 05 '13 at 08:39
  • @JamesKanze - using `auto` here will produce an integral type that's large enough to hold the actual value. `unsigned int` isn't necessarily large enough to hold `1000000`, a problem that the macro doesn't have. – Pete Becker Mar 05 '13 at 12:54
  • @PeteBecker That's an interesting aspect, which I hadn't thought of. Depending on the context, it might be considered a disadvantage or an advantage. (I assume that if `1000000` didn't fit in an `unsigned int`, the compiler will at least warn.) – James Kanze Mar 05 '13 at 13:47
  • @JamesKanze - I know I'm beating a dead horse, but in most contexts, the macro works fine, without needing baroque workarounds. – Pete Becker Mar 05 '13 at 13:51
  • @PeteBecker As you've pointed out, the macro does the same thing as `auto`. (Almost: if an inline function does a `push_back` of it into a vector, the `#define` is the only solution which doesn't have undefined behavior. Or the enum constant, but that's really only justified if either you have a lot of them, that you want to group, or you want to limit scope, say in a class.) Using an explicit type ensures that the constant _has_ that type. Which may be useful at times. – James Kanze Mar 05 '13 at 14:04
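A minimal sketch of the point these comments discuss (the explanation in the comment is a summary, not from the answer):

// An integer literal has the first of the types int, long int, and
// long long int that can represent its value, so auto always deduces
// a type wide enough to hold it, even on platforms where unsigned int
// is only 16 bits.
constexpr auto UNEXPLORED = 1000000;   // int on most platforms, long where int is 16-bit
static_assert(UNEXPLORED == 1000000, "the literal fits its deduced type");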
4

You can use constants, for example:

const unsigned int UNEXPLORED = 1000000;

or enums

enum { UNEXPLORED = 1000000 };
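Either form can then be used just as in the question (a minimal sketch; the 100-node color array is an assumption for illustration):

#include <vector>

enum { UNEXPLORED = 1000000 };   // or: const unsigned int UNEXPLORED = 1000000;

int main()
{
    std::vector<int> color(100, UNEXPLORED); // hypothetical 100-node tree;
                                             // every node starts unexplored
    color[0] = 1;                            // mark node 0 as explored
}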
ForEveR
  • It's getting more and more confusing for me. :( I am trying to learn one specific part and then thousands of other things come up. :( What's the difference between an enum and a struct? Which to use for what? – gen Mar 05 '13 at 08:21
  • @gen For questions like this, I'd suggest you grab a [good book](http://stackoverflow.com/questions/388242/the-definitive-c-book-guide-and-list) to guide your learning. – Angew is no longer proud of SO Mar 05 '13 at 08:26
  • The first doesn't work on systems where `unsigned int` is too small to hold `1000000`. And creating an `enum` just to hold a single constant is icky. – Pete Becker Mar 05 '13 at 12:52
0

For constants, the two answers above are correct; however, #define is not limited to that use alone. Another use of #define is macros.

Macros

Macros are pieces of code handled by the preprocessor, and in that regard they work exactly like other #define declarations: the preprocessor literally replaces every occurrence of your defined symbol with the code of the macro. An example:

#include <iostream>

// The do { ... } while(false) wrapper makes the macro behave like a
// single statement, so it is safe even after an unbraced if/else.
#define HELLO_MAC do{ std::cout << "Hello World" << std::endl; }while(false)

int main(int argc, char** argv)
{
    HELLO_MAC;
}

That will literally swap out the HELLO_MAC symbol with the code I declared. If it were a constant it would do the exact same thing. So you can think of #defines for constants as a particular kind of macro.

With macros you can also pass parameters, which I find especially useful for enforcing logging/exception policies across code. For example:

#define THROW_EXCEPT( ex_type, ex_msg ) \
    do{ throw ex_type( buildExString( (ex_msg), __LINE__, __FILE__ ) ); }while(false)

... 
// somewhere else
THROW_EXCEPT( std::runtime_error, "Unsupported operation in current state" );

That code lets me ensure that every exception is logged with the file and line that threw it.
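The buildExString helper is not shown here; a minimal sketch of what such a helper might look like (its exact signature is an assumption):

#include <sstream>
#include <string>

// Hypothetical helper assumed by THROW_EXCEPT: formats the message
// together with the file name and line number of the throw site.
std::string buildExString(const std::string& msg, int line, const char* file)
{
    std::ostringstream oss;
    oss << file << ":" << line << ": " << msg;
    return oss.str();
}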

Templates are often a better choice than macros, but I cannot use a template function for this example because the __LINE__ and __FILE__ macros must expand at the site of the throw, not at the location of the template function.

Where should you not use macros? Anywhere you can use something else. Macros, like any #define, are handled by the preprocessor, so the compiler never sees them at all. This means that no symbols are ever created for HELLO_MAC or THROW_EXCEPT, so they cannot be seen in a debugger. They can also produce confusing errors when compilation fails, especially with long macros.
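For instance, the HELLO_MAC macro above could be written as an ordinary inline function instead, which both the compiler and the debugger can see (a minimal sketch):

#include <iostream>

// Unlike HELLO_MAC, this function has a real symbol: it can be
// stepped into in a debugger and gives ordinary compile errors.
inline void helloMac()
{
    std::cout << "Hello World" << std::endl;
}

int main()
{
    helloMac();
}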

Dennis