
Given the following code (in GCC 4.3), why is the conversion to reference called in both cases?

class A { };

class B {
public:
  operator A() {}
  operator A&() {}
};

int main() {
  B b;
  (A) b;
  (A&) b;
}

http://ideone.com/d6iF8

Luchian Grigore
Yam Marcovic

1 Answer


Your code is ambiguous and should not compile (it is ill-formed per 13.3.3:2).

lvalue-to-rvalue conversion has the same rank as identity conversion, so (per 13.3.3:1) there is no way to choose between the two conversion functions.

Comeau C++ (probably the most standards-compliant compiler) gives the following error:

"ComeauTest.c", line 11: error: more than one user-defined conversion from "B" to
          "A" applies:
            function "B::operator A()"
            function "B::operator A &()"
    (A) b;
        ^
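
As an aside, one way to sidestep the ambiguity is to name the conversion function explicitly, so that no overload resolution between the two operators takes place. A minimal sketch (not from the question; the operator bodies are filled in here so the snippet actually runs):

class A { };

class B {
public:
  operator A() { return A(); }
  operator A&() { static A a; return a; }
};

int main() {
  B b;
  A  copy = b.operator A();   // explicitly calls B::operator A()
  A& ref  = b.operator A&();  // explicitly calls B::operator A&()
  (void)copy; (void)ref;
}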

Here's the relevant text from the standard:

13.3.3 Best viable function [over.match.best]

[...] Given these definitions, a viable function F1 is defined to be a better function than another viable function F2 [...]

2 - If there is exactly one viable function that is a better function than all other viable functions, then it is the one selected by overload resolution; otherwise the call is ill-formed.

The definitions themselves are complicated, but there are two things to note about user-defined conversions:

First, the application of a user-defined conversion as part of a conversion sequence is specified to decompose into a sequence S_a - U - S_b: a standard conversion sequence, followed by a user-defined conversion, followed by another standard conversion sequence. This covers all the cases; you can't have more than one user-defined conversion in a conversion sequence, and a standard conversion sequence can be the "identity conversion", i.e. no conversion required.
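
For example, here is a minimal sketch of the decomposition for one conversion sequence (the names are illustrative, not from the question):

struct Target { };

struct Source {
    Target t;
    operator Target*() { return &t; }   // U: the user-defined conversion
};

void g(const Target*) { }

int main() {
    Source s;
    g(s);
    // S_a: identity (Source -> Source, nothing to do)
    // U  : Source::operator Target*()
    // S_b: qualification conversion (Target* -> const Target*)
}

S_a would be non-trivial if, for instance, the conversion operator were inherited from a base class, in which case S_a would be a derived-to-base conversion on the implicit object argument.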

Second, when comparing user-defined conversion sequences the only part that matters is the second standard conversion sequence. This is in 13.3.3:

13.3.3 Best viable function [over.match.best]

[...] a viable function F1 is defined to be a better function than another viable function F2 if [...]

  • the context is an initialization by user-defined conversion (see 8.5, 13.3.1.5, and 13.3.1.6) and the standard conversion sequence from the return type of F1 to the destination type (i.e., the type of the entity being initialized) is a better conversion sequence than the standard conversion sequence from the return type of F2 to the destination type.

and in 13.3.3.2:

13.3.3.2 Ranking implicit conversion sequences [over.ics.rank]

3 - Two implicit conversion sequences of the same form are indistinguishable conversion sequences unless one of the following rules applies: [...]

  • User-defined conversion sequence U1 is a better conversion sequence than another user-defined conversion sequence U2 if they contain the same user-defined conversion function or constructor or aggregate initialization and the second standard conversion sequence of U1 is better than the second standard conversion sequence of U2.

So when comparing conversion sequences U1 = (S1_a - U'1 - S1_b) and U2 = (S2_a - U'2 - S2_b) the only thing that matters is the relative rank of S1_b and S2_b; the standard conversion sequences required to arrive at the parameter of the user-defined conversions do not matter.

So the possible conversion sequences for (A) b, requiring a conversion sequence yielding B -> A, are:

U1: B -> B [identity], B::operator A() [user-defined], A -> A [identity]
U2: B -> B [identity], B::operator A &() [user-defined], A & -> A [lvalue-to-rvalue]

Now, how do we rank standard conversion sequences? The place to look is table 12 in 13.3.3.1.1, which specifies that lvalue-to-rvalue conversion has the same rank ("Exact Match") as identity conversion. So the two user-defined conversion sequences cannot be distinguished, and the program is ill-formed.
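
For contrast, here is a variation of my own (not from the question) where the second standard conversion sequences do have different ranks, so the tie is broken and the conversion compiles:

struct B2 {
    operator int()   { return 0; }                  // second SCS: int -> long, integral conversion (Conversion rank)
    operator long&() { static long l; return l; }   // second SCS: long -> long, lvalue-to-rvalue (Exact Match rank)
};

int main() {
    B2 b2;
    long x = b2;   // unambiguous: Exact Match beats Conversion, so B2::operator long&() is selected
    (void)x;
}

Here 13.3.3 can tell the two operators apart because their second standard conversion sequences have different ranks, unlike the identity vs. lvalue-to-rvalue tie above.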


Sidebar

What's the difference between 13.3.3 and 13.3.3.2 as regards ranking user-defined conversion sequences?

13.3.3 allows the compiler to distinguish between different user-defined conversion operators; 13.3.3.2 allows the compiler to distinguish between different functions that each require a user-defined conversion in their arguments.

So, in the code

struct A {
    operator int();
    operator float();
} a;
void f(int);
int main() { f(a); }

13.3.3 applies and A::operator int() is selected over A::operator float(); in the code

struct A {
    operator int();
} a;
void f(int);
void f(double);
int main() { f(a); }

13.3.3.2 applies and void f(int) is selected over void f(double). However, in the code

struct A {
    operator int();
    operator float();
} a;
void f(int);
void f(double);
int main() { f(a); }

even though 13.3.3 can compare the two conversion operators for a fixed overload of f (it prefers A::operator int() over A::operator float() when targeting f(int), and A::operator float() over A::operator int() when targeting f(double), since float -> double beats int -> double), and 13.3.3.2 can compare the two overloads of f for a fixed conversion operator (int -> int beats int -> double, and float -> double beats float -> int), there is no way to distinguish between the two surviving conversion sequences, int -> int (via A::operator int() into f(int)) and float -> double (via A::operator float() into f(double)): they contain neither the same user-defined conversion operator nor the same overload of f, so the code is ill-formed.
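
As an aside (this fix is not part of the original snippets), the usual way out of such an ambiguity is to pick the conversion explicitly with a cast, so that overload resolution on f no longer involves a user-defined conversion; a sketch based on the last snippet:

struct A {
    operator int()   { return 1; }
    operator float() { return 1.0f; }
} a;

void f(int)    { }
void f(double) { }

int main() {
    f(static_cast<int>(a));    // A::operator int() is chosen for the cast, then f(int)
    f(static_cast<float>(a));  // A::operator float() is chosen for the cast, then f(double) via float -> double
}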

ecatmur
  • clang trunk and gcc 4.7.1 allow the code; the Comeau online front-end fails with the error your explanation would require. – pmr Jul 18 '12 at 16:06
  • @YamMarcovic without looking at the source of the compilers, I'd speculate that they're erroneously ranking identity conversion above lvalue-to-rvalue conversion. This seems to be a common problem; see http://stackoverflow.com/questions/11465999/precedence-between-conversion-operators-in-c for an instance where compilers erroneously rank identity conversion above qualification conversion. – ecatmur Jul 18 '12 at 16:42
  • It would be nice if you could give relevant quotes from the paragraphs you cite and explain what happens. Conversions are a mystery to me. – pmr Jul 18 '12 at 16:56
  • `A & -> A` [lvalue-to-rvalue]. I want to ask a simple question: is `cv A& -> A` also [lvalue-to-rvalue]? – mada Aug 20 '22 at 14:46