6

I've only been working with C++ for two or three months, and I recently found out about the `final` specifier that can follow a virtual function. Until now, I believed that omitting `virtual` would stop the propagation of virtuality, but I was wrong. It propagates implicitly.

My question is this: why allow implicit propagation? Why can't the presence of `virtual` make a function virtual and its absence make it non-virtual? Is the current rule better in some circumstance? Or was it, back in the day when `virtual` was first introduced?


According to Clifford's answer, there is even a compiler that generates a warning upon the absence of `virtual`.


Why is the virtuality of methods implicitly propagated in C++?

I expected the above link to answer my question, but it doesn't.

------------ Addition -------------

There are comments asking about the usefulness of this feature. I think the `final` keyword on a virtual function is what devirtualizes it. The function can no longer be overridden, so a derived class must re-declare a function whether or not it has the same name. If `final` is different from devirtualization, help me understand how. If it is not different, then the usefulness of devirtualization is self-evident from the fact that `final` was introduced. I agree that forcing explicit `virtual` would produce bugs, but I'm curious whether there are other reasons.

Sgene9
  • once it's declared virtual at some point, all methods matching the signature are stored in the virtual table, virtual keyword or not. Isn't `final` for Java? Or is it C++11 and I missed something? – Jean-François Fabre Aug 21 '16 at 07:54
  • It's not at all obvious how stopping the propagation would be useful. Do you have a convincing example? (Note that `final` is a much stronger thing than "stop virtual propagation here".) – molbdnilo Aug 21 '16 at 08:10
  • @Jean-FrançoisFabre `final` is C++11, so you're lagging behind by quite a few years. ([This](http://en.cppreference.com/w/) is useful.) – molbdnilo Aug 21 '16 at 08:12
  • The basic answer to your question is 'because Bjarne Stroustrup'. – user207421 Aug 21 '16 at 08:30
  • Alternatively, you can blame the immigrants. – Karoly Horvath Aug 21 '16 at 08:35
  • And even Bjarne didn't come up with this himself, but borrowed from [the Simula language](https://en.wikipedia.org/wiki/Simula), which had classes, inheritance, and this rule for virtual functions already in the 1960's. So one answer could be *It has always been this way*. – Bo Persson Aug 21 '16 at 08:42
  • @molbdnilo Basically, wherever I would like to prevent it from being overridden. I don't see how **final** as a function specifier, not as a class specifier, is different from "stop virtual propagation here", thus preventing override. – Sgene9 Aug 22 '16 at 12:07
  • @Sgene9 "Wherever I would like to" is not an example of usefulness, much less a convincing one. Post a concrete situation and explain how this would solve an actual problem. – molbdnilo Aug 22 '16 at 12:50
  • It'd be pretty confusing if `vehicle->go()` invoked `Car::go()` but not `Toyota::go()` – M.M Aug 23 '16 at 00:47
  • To allow subclasses to make the same function non-virtual would be to violate the Liskov substitution principle. In doing this you force subclasses to introduce special hacks to check which type something is before acting on it and introduce unnecessary complexity and opportunity for error. Imagine if you passed your object to a function by reference to a base class and as a result a different function was called than what you expected. It would be chaos! – Paul Rooney Aug 23 '16 at 01:21
  • @molbdnilo I gave more clarification in the question by editing it. – Sgene9 Aug 23 '16 at 07:23
  • @PaulRooney wouldn't declaring some **virtual** function as **final** violate the principle, too? Say, A has a virtual function **foo** and B which inherits A declares **foo** as **final**. C which inherits B declares another function named **foo**. Are A and C interchangeable? – Sgene9 Aug 23 '16 at 09:16
  • It wouldn't be able to declare foo with the same signature. That's the point of it. It could declare it with a different signature but then that wouldn't be overriding. – Paul Rooney Aug 23 '16 at 11:01
  • @PaulRooney If you gave an answer with the content you provided me, I would've checked the answer. Basically my understanding is that **final** doesn't "de-virtualize". **final** disallows derived classes to declare a function with the same signature, which results in disallowing override. Since you showed me that **final** is different from "de-virtualize", my question is now answered. – Sgene9 Aug 24 '16 at 16:06

1 Answer

8

Answering why a particular feature exists (or doesn't) is usually rather difficult, as it becomes a matter of guessing and opinions. However, the simple answer could be the principle of least astonishment: coming up with a scheme that makes sense and works reliably and predictably would be difficult.

What would "devirtualizing" a function even mean? If, at runtime, you're calling a "devirtualized" function on an object, would it use the static type of the pointer instead? If the static type has a virtual function but the runtime type doesn't, what happens?

#include <iostream>

struct A     {  virtual void f() const { std::cout << "A"; }  };
struct B : A {          void f() const { std::cout << "B"; }  };
struct C : B {  virtual void f() const { std::cout << "C"; }  };
struct D : C {          void f() const { std::cout << "D"; }  };

void f(const A& o) { o.f(); }

int main()
{
                // "devirtualized"     real C++

    f(A{});     // "A"                 "A"
    f(B{});     // "A" or "B"?         "B"
    f(C{});     // "C"?                "C"
    f(D{});     // oh god              "D"
}

There's also the fact that for the vast majority of designs, a virtual function has to stay virtual in the whole hierarchy. Requiring virtual on all of them would introduce all sorts of bugs that would be very hard to diagnose. C++ usually tries to stay away from features that require discipline to get right.

isanae
  • "C++ usually tries to stay away from features that require discipline to get right"... well, until you start learning more of the language, or you come into it from the C world. Then that breaks down just a little :P – Sebastian Lenartowicz Aug 23 '16 at 01:42
  • @SebastianLenartowicz RAII, smart pointers, tighter type system, exceptions, variadic templates, the list goes on. What on earth are you talking about? – isanae Aug 23 '16 at 01:44
  • I mean, true enough. I'm just of the opinion that C++ (which, don't get me wrong, I love) has a lot of little "gotchas" if you're not prepared for them. – Sebastian Lenartowicz Aug 23 '16 at 01:55
  • @isanae `f(A)` = "A", `f(B)` = "A", `f(C)` = "C", `f(D)` = "C". struct A declared a virtual function; struct B inherited A; struct C inherited B, but since `B::f()` is not virtual, C declared another virtual function; struct D inherited C – Sgene9 Aug 23 '16 at 07:25
  • @isanae `struct A { virtual void f() const { std::cout << "A"; } }; struct B : A { virtual void f() const final { std::cout << "B"; } }; struct C : B { virtual void f() const { std::cout << "C"; } }; struct D : C { virtual void f() const final { std::cout << "D"; } };` Basically, I think your code with my idea should behave like the above – Sgene9 Aug 23 '16 at 07:29
  • @Sgene9 This makes no sense to me. – isanae Aug 24 '16 at 01:41
  • @isanae I'm sorry that I asked an obscure question and asked for clear answer. You might want to check out Paul Rooney's comments under the Question, that guided me into getting my question solved. – Sgene9 Aug 24 '16 at 16:11
  • @Sgene9 Your question was "Why allow implicit propagation?" as a concept, not the more technical "Does `final` stop propagation?" My answer to the latter would have been very different. – isanae Aug 24 '16 at 16:15
  • @isanae You are right. The question was based on my belief that **final** stops propagation. Sorry for the mistake. – Sgene9 Aug 26 '16 at 06:28
  • I think this answer doesn't really test the right (wrong) assumptions that people may have, because `void f(const A& o)` always calls via an `A` type, hence always calls a virtual method. More interesting would be what `void f(const B& o)` would output. Until today, I thought that this would simply output `"B"` no matter what it would be called with. – Felix Dombek Apr 25 '18 at 15:43