I have just been using `#define` to change my values, for example:

    #include <iostream>
    #include <string>

    #define Changed_num 100

    using namespace std;

    int main()
    {
        cout << Changed_num << endl;
    }

But I heard there's a better way to do that without `#define`. (What's the other way?)

So that leaves me to ask: why would you need `#define` then? What's the most common use for it?

LostPecti
  • It [should not be used that way](http://stackoverflow.com/questions/4767667/c-enum-vs-const-vs-define). Use `constexpr int changed_num = 100;` – Nawaz Jun 17 '14 at 05:16
  • If you are just defining a number, it is better to use a constant. `#define` is really powerful but has its flaws, so you have to be careful when using it. – santahopar Jun 17 '14 at 05:16
  • @armanali Shouldn't it be the other way around: it has flaws but is really powerful? :) – Alexei Levenkov Jun 17 '14 at 05:21
  • I find it is most often used to make code harder to read and debug. :) – Retired Ninja Jun 17 '14 at 05:33

3 Answers


The C++ equivalent of `#define` in the shown scenario is `const`. `#define` is handled not by the compiler but by the preprocessor, which modifies the code before it is actually sent to the compiler. Hence in your case the compiler only ever gets to see `cout << 100 << endl;`. Using `const` is better practice because it is actually type safe.
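
For example, a `const` (or, since C++11, `constexpr`) version of your constant would look roughly like this:

    #include <iostream>

    const int Changed_num = 100;          // typed and scoped, unlike a macro
    // constexpr int Changed_num = 100;   // C++11 alternative, usable in constant expressions

    int main()
    {
        std::cout << Changed_num << std::endl;
    }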

The most common use case for `#define` in C++ is for include guards, to ensure a header file's contents are only included once in any given translation unit.
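
A typical include guard looks like this (`MY_HEADER_H` is just an illustrative name):

    #ifndef MY_HEADER_H
    #define MY_HEADER_H

    // declarations for this header...

    #endif // MY_HEADER_H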

The runner-up is most likely platform-specific compilation/optimization; see this link for some examples.
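
A minimal sketch of what such platform-specific compilation usually looks like (`_WIN32` and `__linux__` are the usual compiler-predefined macros):

    #ifdef _WIN32
        // Windows-specific code
    #elif defined(__linux__)
        // Linux-specific code
    #else
        // fallback for other platforms
    #endif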

Niels Keurentjes

In general: yes, you should try to avoid `#define` (see the C++ FAQ: Are you saying that the preprocessor is evil?).

The reason is that `#define`s are handled by the preprocessor at the lexical level (globally), so you have little control over the effect of your `#define` replacements and they are likely to backfire (e.g. a substitution brought in because you included a header that included a header that included a header with a `#define` can cause your code to misbehave or simply not compile).
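
A sketch of how this backfires in practice (imagine the `#define` below lives deep inside a header you included indirectly; the `max` macro is a classic real-world offender):

    #include <algorithm>

    // pulled in from some unrelated header:
    #define max(a, b) ((a) > (b) ? (a) : (b))

    int largest = std::max(3, 7);   // the preprocessor rewrites this to
                                    // std::((3) > (7) ? (3) : (7))  -- a compile error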

What should you use?

  • For constants, use, well, constants (e.g. `const int Changed_num = 100;`), either as variables or as static class attributes (see: Why would I use a const variable / const identifier as opposed to #define?). Check also the new `constexpr` keyword in C++11.
  • For code, you should try to write actual code (you know: classes, abstractions, etc.) instead of cutting and pasting :D (which is more or less what a `#define` is). In cases where macro parameters are used you can often replace the macro with templated code (see the sketch after this list); in some other cases you will still need macros.
  • For conditional compilation (suggested by @James Kanze): you can use different versions of the header files, placed in different include directories (and selected at compilation time with different `-I` options).
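
As a sketch of the macro-vs-template point above (`SQUARE` and `square` are just illustrative names):

    // Function-like macro: pure textual substitution, no type checking,
    // and SQUARE(i++) would evaluate i++ twice.
    #define SQUARE(x) ((x) * (x))

    // Templated replacement: type-checked, and the argument is evaluated exactly once.
    template <typename T>
    T square(T x) { return x * x; }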

When you should still use #define

  • Include guards.
  • On some occasions, code (but be very much aware of the downsides).
  • Conditional compilation, e.g. multi-platform builds (again, be very much aware of the downsides) [but consider the option of using different include directories described above; see the sketch below].
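
A sketch of the include-directory alternative mentioned in the last item (the file layout and build commands are purely illustrative):

    // main.cpp includes the same header name on every platform:
    #include "platform.h"

    // The build then selects which directory provides platform.h, e.g.:
    //   g++ -Iinclude/linux   main.cpp    (Linux build)
    //   g++ -Iinclude/windows main.cpp    (Windows build)
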
jsantander
  • Thanks, you answered both of my questions in a way a beginner can understand... Thanks everyone who answered or commented :) – LostPecti Jun 17 '14 at 05:35
  • Using `#define` and conditional compilation is _not_ a good idea for platform dependencies. The _only_ `#ifdef` in your code should be the include guards. Platform dependencies are better handled by using separate functions, in separate include files, in a path determined by the `-I` options. – James Kanze Jun 17 '14 at 08:36
  • @JamesKanze I added your suggestion. However, I still keep the possibility of using #defines. I feel there are still cases that will not be possible with that option, or where macros will be a lot less hassle. – jsantander Jun 17 '14 at 08:45

For constant values, using `const int Changed_num = 100;` has the advantage over `#define Changed_num 100` that you can assign a type. For example, you can write `const unsigned long Changed_num = 100;`, which is a bit trickier to express as a `#define`. (You can do something like `#define Changed_num 100ul`, but it's not as obvious.)

One possible use for `#define` is as part of logging macros, such as those in boost::log. They have the advantage of being able to interpolate things like `__FILE__` and `__LINE__` at the point they're called. They are also used for code generation, such as with boost::foreach (which has been supplanted by the range-based for in C++11) or boost::python. In general, however, you're better off using [templated] functions so that you get proper type safety.
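
A minimal sketch of such a logging macro (`LOG` is a hypothetical name, not the actual boost::log interface); the point is that `__FILE__` and `__LINE__` expand at the call site, which a plain function cannot do by itself:

    #include <iostream>

    #define LOG(msg) \
        (std::cerr << __FILE__ << ":" << __LINE__ << ": " << (msg) << std::endl)

    int main()
    {
        LOG("something happened");   // prints e.g. example.cpp:10: something happened
    }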

The main disadvantage of `#define` is that it's a super heavy hammer. If you `#define` something, you can't override it later with a local variable or function. One particularly egregious case is Windows.h, which `#define`s `min` and `max`, giving compiler errors if you try to use `std::min` unless you define `NOMINMAX`.
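
On Windows the usual workaround is to define `NOMINMAX` before the include (this obviously only applies to Windows builds):

    #define NOMINMAX        // stop Windows.h from defining min/max macros
    #include <Windows.h>
    #include <algorithm>    // std::min / std::max now work as expected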

seeker
  • +1 / re "prefer that my constants be named more like `CHANGED_NUM`." - good suggestion if it remains a `#define`, but not good if it's changed to `const` (or `constexpr` / `enum`). Common practice is to reserve multicharacter all-uppercase identifiers for preprocessor definitions. – Tony Delroy Jun 17 '14 at 05:48
  • 1
    @TonyD Fair enough. I just removed "As a side note, I prefer that my constants be named more like CHANGED_NUM." from my answer. I do like to make my constants stand out visually, but I guess the modern way of doing this is now different. (The Google style guide apparently recommends kChangedNum.) – seeker Jun 17 '14 at 06:04