I'm trying to figure out the behavior of the #define directive in C (I'm new to it). I've got this code, and I don't understand why the output shows myAge=15 and not 16 (I know it prints 15, but I don't know why). Can anybody help me figure out why this happens?

This is the code:

#include <stdio.h>
#include <stdlib.h> 
#define AGE 15; 

int main(void)
{
    float myAge = AGE + 1;
    printf("Hello!\n");  
    printf("My name is Raz and I am %d years old!\n", myAge);

    system("PAUSE");

    return 0;
}

Thanks for helping :)

Raz Omry
  • See [`static const` vs `#define` vs `enum`](http://stackoverflow.com/questions/1674032/static-const-vs-define-vs-enum/) for a run-down of the relative merits of different ways of defining constant values. – Jonathan Leffler Nov 19 '16 at 17:03

1 Answer

#define is a textual replacement performed by the preprocessor prior to the compilation step. In this case, you're asking the preprocessor to expand the token AGE to 15;. The semicolon is part of the expansion, so this is the code you would get after the preprocessing step:

float myAge = 15; + 1;

As you can see, it does not expand to what you expect.
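
If you want to see this for yourself, one way (assuming GCC or Clang; other compilers have an equivalent option) is to run only the preprocessor and inspect its output:

gcc -E main.c

(main.c here is just a placeholder for whatever your source file is called.)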

You can fix this issue by removing the semicolon from the #define:

#define AGE 15

Better yet, avoid using the preprocessor for simple numerical constants - consider using a const int instead:

const int age = 15;
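
For completeness, here is a minimal sketch of the whole program with that change applied. Note that %d in printf expects an int, so myAge is declared as int here rather than float as in the original (passing a float to %d is a separate mismatch):

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    const int age = 15;     /* typed constant, no stray semicolon to worry about */
    int myAge = age + 1;    /* 16 */

    printf("Hello!\n");
    printf("My name is Raz and I am %d years old!\n", myAge);

    system("PAUSE");        /* Windows-specific pause, kept from the original */

    return 0;
}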
Vittorio Romeo
  • Is there a benefit in avoiding using the preprocessor? I thought that `#define`ing an integer caused it to be calculated at compile time (thus faster at runtime), whereas a `const` wouldn't be. – Addison Nov 19 '16 at 12:59
  • thank you about that. that is very useful for me! :) – Raz Omry Nov 19 '16 at 13:05
  • 7
    *"As you can see, it is invalid C code."* – Actually it is *valid* C code (otherwise it would not compile and produce the unexpected output). – Martin R Nov 19 '16 at 13:08
  • 2
    @Addison - you are incorrect, In this case and many others using `const int...` is a lot better. You have the power of type safety as well – Ed Heal Nov 19 '16 at 13:08
  • @EdHeal - I'm willing to accept that I'm wrong, and you raise a good point about type safety, but I will not accept "using `const int...` is a lot better" as a reason. Just why is it a lot better? – Addison Nov 19 '16 at 13:11
  • @EdHeal - A quick search showed up some reasons as to why `consts` are preferred: you can use pointers to consts, unlike with defined values, and you can give them the appropriate scope to improve code readability and minimise abuse of the variable. Additionally, const values are visible to debugging tools. – Addison Nov 19 '16 at 13:18
  • @addison - I think you have just answered your own question. – Ed Heal Nov 19 '16 at 13:22
  • Yep. I suppose they are "a lot better", but it's good that I know why – Addison Nov 19 '16 at 13:23
  • @addison - At least you know and by doing a little research you understand why – Ed Heal Nov 19 '16 at 13:24
  • @MartinR: fixed, thanks. – Vittorio Romeo Nov 19 '16 at 13:25