Consider these classes and accumulation function, which are a simplification of my original context (yet reproduce the same problem):
abstract static class Foo {
    abstract int getK();
}

static class Bar extends Foo {
    int k;
    Bar(int k) { this.k = k; }
    int getK() { return this.k; }
}

private static Foo combined(Foo a1, Foo a2) {
    return new Bar(a1.getK() + a2.getK());
}
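Calling the function directly with Bar arguments compiles and works as expected (a trivial sanity check of my own, just to rule out the function itself):

// Sanity check (not part of the original report): Bar values are accepted
// through the Foo parameters, and the result is a Foo with the summed k.
Foo five = combined(new Bar(2), new Bar(3));
System.out.println(five.getK()); // prints 5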
I have attempted to perform an accumulation of items (originally data indexing reports) by relying on a separate function, combined, which deals directly with elements of type Foo:
Foo outcome = Stream.of(1, 2, 3, 4, 5)
        .map(Bar::new)
        .reduce((a, b) -> combined(a, b))
        .get();
It turns out that this code results in a compilation error (OpenJDK "1.8.0_92"): "Bad return type in lambda expression: Foo cannot be converted to Bar". The compiler insists on reducing the stream with Bar as the accumulator element, even though Foo is a common type for both the arguments of the accumulation function and its return type.
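If my reading of the error is correct (this is only my interpretation), the single-argument reduce on a Stream<Bar> demands a java.util.function.BinaryOperator<Bar>, so the same mismatch can be reproduced outside the pipeline:

// combined fits naturally as a BinaryOperator<Foo>...
BinaryOperator<Foo> asFooOperator = (a, b) -> combined(a, b);    // compiles
// ...but reduce on a Stream<Bar> expects a BinaryOperator<Bar>, and the
// equivalent standalone declaration fails with the same error:
// BinaryOperator<Bar> asBarOperator = (a, b) -> combined(a, b); // does not compile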
I also find it peculiar that I can still take this approach as long as I explicitly map the stream into a stream of Foos:
Foo outcome = Stream.of(1, 2, 3, 4, 5)
        .<Foo>map(Bar::new)
        .reduce((a, b) -> combined(a, b))
        .get();
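I also seem to be able to sidestep the problem with the three-argument overload of reduce, presumably because its result type is a separate type parameter that is not tied to the stream's element type (this is just my own workaround sketch, using an arbitrary zero-valued Bar, widened to Foo, as the identity):

Foo outcome = Stream.of(1, 2, 3, 4, 5)
        .map(Bar::new)
        .reduce((Foo) new Bar(0),           // identity, cast so that Foo is inferred
                (a, b) -> combined(a, b),   // accumulator: (Foo, Bar) -> Foo
                (a, b) -> combined(a, b));  // combiner, only used for parallel streams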
Is this a limitation of Java 8's generic type inference, a small issue with this particular overload of Stream#reduce, or an intentional behaviour that is backed by the Java specification? I have read a few other questions on SO where type inference has "failed", but this particular case is still a bit hard for me to grasp.