0

For example, declaring a reference without binding it to something gives a compiler error, e.g. int &refVar; but this doesn't hold true for pointers. I get that C++ skips some sane-default features, like bounds checking, to gain speed, but that reasoning doesn't fit well with pointer variables.
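
A minimal illustration of what I mean (ptrVar is just an illustrative name):

int &refVar;   // error: a reference must be initialized when declared
int *ptrVar;   // compiles fine; ptrVar holds an indeterminate value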

Why would someone declare a pointer variable and not initialize it to something? And why not to NULL/0, or even nullptr? Most static analyzers actually flag this problem, so what could be the reason for leaving this behaviour undefined? Isn't it a great source of bugs?

  • To be fair, zero-initializing pointers does have an impact on performance, albeit tiny. – Frédéric Hamidi Jan 20 '16 at 14:28
  • @VioletGiraffe http://ideone.com/DCptGZ –  Jan 20 '16 at 14:29
  • @oopaewem That *is* default-initialization (it is a well-defined term in the language). Your question is really "Why does default-initialization perform no initialization in some cases?" – TartanLlama Jan 20 '16 at 14:32
  • Go back 45 years to K&R: the PDP-11 ran at 1 MHz or so. Then 30 years to C++: the PC ran at 4.77 MHz. C and C++ were designed as portable assemblers, so speed was king. There are also many times I use a pointer because I don't yet know what I want it to reference; initializing it to anything is pointless, e.g. when a factory method will decide what it should be. In those days initialization might also incur a malloc and a default constructor, and compilers weren't what they are now. – ChrisR Jan 20 '16 at 14:34

4 Answers

3

I think it is most likely due to the possible overhead of initializing these variables. That might not be much of a problem for a single variable, but imagine arrays allocated on the stack.
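
A sketch of the kind of case this answer has in mind (the function and names are illustrative): a large automatic array of pointers, where mandatory zero-initialization would cost a write per element even if only a few entries are ever used.

void use_table()
{
    // 8192 pointers on the stack (64 KiB on a 64-bit platform).
    // If the language required zero-initialization here, entering
    // this function would have to write every slot, even though the
    // code below only ever touches one of them.
    int* table[8192];

    int value = 42;
    table[0] = &value; // the only entry actually used
}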

Giorgi Moniava
  • 27,046
  • 9
  • 53
  • 90
  • In fact, C++ has not penetrated embedded systems even to this day because of real or imagined performance degradation. C has only recently been adopted now that 32-bit architectures are available; the 8-bit embedded world is still largely written in assembly, though assembly is fast becoming legacy for all but the most demanding apps. – ChrisR Jan 20 '16 at 14:45
1

The reason is performance. Imagine a situation where you know you need a pointer, but what it points to depends on a run-time value.

#include <iostream>

int a, b, c;

int main()
{
    int* ptr; // deliberately uninitialized: every path below assigns it

    int something;
    std::cin >> something;
    switch (something)
    {
        case 0:  ptr = &a; break;
        case 1:  ptr = &b; break;
        default: ptr = &c; break;
    }
    return *ptr; // ptr is guaranteed to be assigned before this use
}

Here, forcing ptr to be zero-initialized would introduce unnecessary run-time overhead: the pointer is assigned on every path before it is ever read.

Vittorio Romeo
  • 90,666
  • 33
  • 258
  • 416
  • This is probably false in practice. Most *optimizing* compilers would skip the initialization, so no runtime overhead for your example in practice. – Basile Starynkevitch Jan 20 '16 at 14:30
  • They probably would, as they would for any other type of variable. The point is that initialization is not the default, to allow compilers to optimize more aggressively and to avoid introducing unnecessary (albeit insignificant in practice) initialization overhead. – Vittorio Romeo Jan 20 '16 at 14:31
  • @BasileStarynkevitch, I just went through a related issue with MSVC, where someone on another team set the build to treat use of an uninitialized variable as an error rather than a warning. GCC, ICC and others could easily see that the variable was used under conditions identical to those under which it was assigned, but it needed to be declared (uninitialized) outside the conditions where it was assigned or used. MSVC could not see that the use condition matched the assign condition, so it demanded an initialization that it would not have optimized out. – JSF Jan 20 '16 at 14:34
0

Very probably, because C++ wants to be very compatible with C (or at least, old versions of C++ wanted to be compatible with C89).

BTW (as a personal coding rule, which can be debated), I try to initialize every scalar variable in C and in C++. If that initialization is unneeded, the optimizing compiler will remove it; otherwise, it gives my program more deterministic behavior.
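
As a minimal sketch of that claim (the function is hypothetical), the defensive initialization below is a dead store on every path, so an optimizing compiler typically eliminates it at -O2:

int choose(bool flag)
{
    int x = 0;  // defensive initialization, per the rule above
    if (flag)
        x = 42;
    else
        x = 7;
    return x;   // x is overwritten on every path, so the store of 0
                // is dead and an optimizing compiler removes it
}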

Basile Starynkevitch
  • 223,805
  • 18
  • 296
  • 547
0

It is for historic reasons only. Back in the day, when compiler optimizations were rudimentary if present at all, and C++ was newly introduced, its authors needed to make sure C++ code was as fast as C code when no C++ features were used. Otherwise, no one would ever have migrated to it.

So they made this rule to follow suit with C: there is no default initialization for automatic variables (statics, by contrast, are zero-initialized). Nowadays, of course, it would be easy to require an initializer for every scalar, or to default-initialize one unless it is assigned before use later in the code, but no one has the energy to do so.
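
A minimal sketch of that rule (names are illustrative): variables with static storage duration start out zero, while automatic variables hold an indeterminate value until assigned.

int global_count;       // static storage duration: zero-initialized

void f()
{
    static int calls;   // also static storage: starts at 0
    int local;          // automatic storage: indeterminate value
    // reading 'local' before assigning it would be undefined behaviour
}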

SergeyA
  • 61,605
  • 5
  • 78
  • 137