11

I cannot understand the difference between:

#define WIDTH 10 

and

int width = 10;

What are the benefits of using the first or the second?

Brian Tompsett - 汤莱恩
  • 5,753
  • 72
  • 57
  • 129
Rrjrjtlokrthjji
  • 602
  • 2
  • 10
  • 26

6 Answers

13

Well, there is a great difference. You can change the value of width, you can take its address, you can ask for its size, and so on. WIDTH, on the other hand, is simply replaced with the constant 10 everywhere, so an expression like ++WIDTH doesn't make any sense. But you can declare an array with WIDTH items, whereas you cannot declare a statically sized array with width items (before C99's variable-length arrays).

Summing it up: the value of WIDTH is known at compile time and cannot be changed; the compiler allocates no memory for it. On the contrary, width is a variable with the initial value 10; its later values are not known at compile time, and the compiler does allocate memory for it.

Vlad
  • 35,022
  • 6
  • 77
  • 199
  • so the macro WIDTH cannot be changed, and it is used to state something "classic" in the program? Like #define AUTHOR "Myself"? – Rrjrjtlokrthjji Jun 18 '12 at 08:50
  • 1
    @Nick: the macros can be used in thousands of different ways. You can define macro to contain a piece of code: `#define CHECKSUCCESS(n) if (n) return n; else {}`. The way you propose is a valid usage as well. – Vlad Jun 18 '12 at 08:52
  • @Nick: By the way, with macros you can do weird things, like `#define sin(x) cos(x)`! Use with care. – Vlad Jun 18 '12 at 09:02
8

What is the difference between the two?

The first is a macro, while the second is a variable declaration.

#define WIDTH 10 is a preprocessor directive that allows you to specify a name (WIDTH) and its replacement text (10). The preprocessor parses the source file, and each occurrence of the name is replaced by its associated text. The compiler never actually sees a macro name at all; what it sees is the replaced text.

The variable declaration is handled by the compiler itself. It tells the compiler to declare a variable named width of type int and to initialize it with the value 10.
The compiler knows this variable by its name, width.

Which one should you prefer? And Why?

Usually, it is recommended to use compile-time constant variables over #define. So your variable declaration should be:

const int width = 10;

There are a number of reasons for preferring compile-time constants over #define, namely:

Scope Based Mechanism:

The scope of a #define runs from its definition to the end of the file it appears in, so #defines created in one source file are NOT available in a different source file. In short, #defines don't respect scopes. const variables, on the other hand, can be scoped: they obey all the usual scoping rules.


Avoiding weird magic numbers in compilation errors:

If you are using a #define, it is replaced by the preprocessor before compilation. So if you receive an error in code that uses the macro, the message is confusing: it won't refer to the macro name but to the replacement value, which appears out of nowhere, and one can waste a lot of time tracking it down in the code.


Ease of Debugging:

Also, for the same reasons mentioned in #2, a #define provides no real help while debugging: the debugger sees only the replaced value, never the macro name.

Alok Save
  • 202,538
  • 53
  • 430
  • 533
  • The question is tagged `c`, are you sure about namespaces and classes? – Vlad Jun 18 '12 at 08:54
  • 1
    One advantage of a DEFINE when programming in embedded systems is that a DEFINE will never consume memory where a variable will have to be stored somewhere. – RedX Jun 18 '12 at 08:55
  • 1
    I thought namespaces and classes are used in C++. Nice answer though – Rrjrjtlokrthjji Jun 18 '12 at 08:55
  • @RedX: well, _theoretically_ the variable which is never modified ought to be optimized by the compiler to take no memory. But the embedded compilers are known to be quite weak in optimizations. – Vlad Jun 18 '12 at 08:56
  • Ahm, modified the answer to suit the C tag. I didn't see the tags before. – Alok Save Jun 18 '12 at 08:59
  • @Vlad because of the very simple optimizations done by most embedded compilers i thought i'd specifically point that out. – RedX Jun 18 '12 at 10:46
  • I'd argue that pre-processing directives are also a bit safer than variables. Assuming you don't apply any obfuscation method, a variable is a human-readable identifier, say `apiKey`, while a pre-processing directive replaces the reference directly, so it's a bit harder to search for. – henrique Jun 10 '21 at 13:04
2

WIDTH is a macro which will be replaced with the value (10) by the preprocessor whereas width is a variable.

When you #define a macro (like WIDTH here), the preprocessor will simply do a text-replacement before the program is passed to the compiler. i.e. wherever you used WIDTH in your code, it'll simply be replaced with 10.

But when you do int width = 10, the variable is alive: it occupies storage, and its value can be changed at run time.

P.P
  • 117,907
  • 20
  • 175
  • 238
0

First, a short background: before being compiled, a C file is pre-processed. The pre-processor handles directives such as #include and #define.

In your case, that #define directive tells the pre-processor to replace every occurrence of the token WIDTH in your source code with the text 10. When the file gets compiled in the next step, every occurrence of WIDTH will in fact be 10. Now, the difference between

#define WIDTH 10 

and

int width = 10;

is that the first one can be seen as a constant value, whereas the second is a normal variable whose value can be changed.

0

The #define is handled by the preprocessor: when it finds WIDTH in the source code, it replaces it with 10; essentially it performs textual substitution, among other things. int width = 10;, on the other hand, is handled by the compiler: it creates an entry in the symbol table, generates code to allocate enough memory (on the stack or elsewhere, depending on where it is defined), and copies the value 10 into that memory location.

So one is nothing more than a label for a constant, while the other is a variable that exists at run time.

You can use preprocessor constants for faster execution, since variables need to be allocated and read from memory, at the cost of not being mutable at run time.

You usually use the preprocessor for things that don't need to change at run time. Be careful, though: macros can be tricky to debug, since they manipulate the source code before it is handed off to the compiler, which can lead to very subtle bugs that may not be apparent from examining the source code.

Samy Vilar
  • 10,800
  • 2
  • 39
  • 34
0

A #define is like a definition with static, global scope. It cannot be changed or overwritten like a normal variable.

Eric Aya
  • 69,473
  • 35
  • 181
  • 253
  • (This post does not seem to provide a [quality answer](https://stackoverflow.com/help/how-to-answer) to the question. Please either edit your answer, or just post it as a comment to the question). – sɐunıɔןɐqɐp Jul 27 '18 at 07:02