-2

I have a theoretical question for you ;). In the book I've been learning C++ from (it's great, but it's only in my native language, so the title wouldn't mean anything to you) the author compared #define with const variables. He favoured the second method because it is better for debugging, but one thing he didn't talk about at all is memory usage. Say we have a large number of constants we wish to define. Of course it can take a lot of memory. To be honest, I am still learning and have never needed so many constants, but when I learnt you can for example choose between short, int and long, I started thinking that maybe those few bits make a difference in big programs. So my question is: what do you think about that?

Brian Tompsett - 汤莱恩
Fiodor

1 Answer

5

Say we have a large number of constants we wish to define. Of course it can take a lot of memory

Actually, it probably doesn't. More likely all your constants are hard-coded directly into the expressions that use them, by your clever compiler. In this way, the memory footprint is identical to that of a preprocessor macro.
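As a minimal sketch of what that means (the constant names below are invented for illustration), both of the following typically end up as immediate values baked into the generated instructions when you build with optimisations enabled, e.g. -O2 on GCC or Clang; you can confirm it yourself by inspecting the assembly produced with -S:

    #include <iostream>

    #define AREA_LIMIT_MACRO 4096          // preprocessor constant: pure textual substitution
    const int area_limit_const = 4096;     // typed constant: value known at compile time

    int main() {
        // An optimising compiler usually folds both constants straight into the
        // expression below as immediate operands, so neither needs run-time storage.
        int total = AREA_LIMIT_MACRO + area_limit_const;
        std::cout << total << '\n';        // prints 8192
    }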

In cases where this does not happen automatically, there is usually a good reason. You will also typically find that this does not occur when you compile with optimisations disabled, such as for "debug mode builds"; for such builds, it is usually program correctness that you're seeking, not high performance.

when I learnt you can for example choose between 'short', 'int' and 'long', I started thinking that maybe those few bits make a difference in big programs

In some cases, sure: network transmission of large amounts of data, systems with huge scalability concerns such as Facebook's database... but in general, you don't want to be worrying about this stuff. When you need to, you'll know.
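To give a rough, hypothetical sense of when the width does matter (the element count here is arbitrary), the per-element size gets multiplied by however many values you store:

    #include <cstdint>
    #include <iostream>
    #include <vector>

    int main() {
        const std::size_t count = 10000000;        // ten million values, purely illustrative

        std::vector<std::int16_t> narrow(count);   // 2 bytes each -> roughly 20 MB
        std::vector<std::int64_t> wide(count);     // 8 bytes each -> roughly 80 MB

        std::cout << "int16_t buffer: " << narrow.size() * sizeof(std::int16_t) << " bytes\n";
        std::cout << "int64_t buffer: " << wide.size()   * sizeof(std::int64_t) << " bytes\n";
    }

For a handful of standalone constants, though, the same arithmetic works out to a few bytes at most, which is why it usually isn't worth worrying about.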

Lightness Races in Orbit