I really can't wrap my head around what is happening here. I have the following (minimal) code:
import java.util.ArrayList;
import java.util.List;

public class SillyCast {
    interface B {}
    interface D<T extends B> {}
    class C<T extends D<?>> {}
    class A<T extends B> extends C<D<T>> {}

    List<Class<? extends C<?>>> list = new ArrayList<>();

    void m() {
        list.add((Class<? extends C<?>>) A.class);
    }
}
When compiling (and running) this code in Eclipse, everything works as expected, but when compiling it from the command line (Oracle JDK) I get the following error:
error: incompatible types: Class<A> cannot be converted to Class<? extends C<?>>
        list.add((Class<? extends C<?>>) A.class);
I know that the Oracle JDK does not always behave exactly the same as the Eclipse JDT compiler, but this seems weird.
How can I add A.class to my ArrayList? Intuitively it seems like this should be possible.
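For reference, the only way around it I can think of is a double cast that widens to Class<?> first and then narrows again. This is just a sketch (the method name is made up), and I assume javac only downgrades the second cast to an unchecked warning rather than rejecting it:

    // Hypothetical workaround: widen Class<A> to Class<?>, then perform an
    // unchecked narrowing cast to Class<? extends C<?>>. I assume javac
    // accepts this with an "unchecked" warning instead of an error.
    @SuppressWarnings("unchecked")
    void addViaDoubleCast() {
        list.add((Class<? extends C<?>>) (Class<?>) A.class);
    }

Even if that compiles, it only sidesteps the question.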
Why do the Oracle JDK and Eclipse JDT have different opinions about this cast? Why does the Oracle JDK have a problem with it?
At first I thought this might be a bug, but I tested it on different Java versions (8 and 11) and the problem occurs on all of them. Also, this kind of cast seems way too "simple" to go unnoticed over multiple versions, so I expect this behavior to be by design.