The reason the reduce included above doesn't work is that 'The identity value must be an identity for the combiner function. This means that for all u, combiner(identity, u) is equal to u.'
'dummy' is not a valid identity for String concatenation because dummy + value is not equal to value. Replacing it with something like this should work:
import java.util.stream.Stream;

public class Test
{
    public static void main(String[] args)
    {
        Stream<String> data = Stream.of("a", "b", "c", "d", "e", "f");
        // "" is a valid identity here because accumulate("", v) returns v
        System.out.println(data.parallel().reduce("", Test::accumulate));
    }

    // Joins two values with ':', treating the empty string as neutral
    private static String accumulate(String v1, String v2)
    {
        if (v1.isEmpty()) return v2;
        if (v2.isEmpty()) return v1;
        return v1 + ":" + v2;
    }
}
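Running this prints a:b:c:d:e:f, because "" genuinely satisfies combiner(identity, u) == u for the accumulate method above.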
As an alternative, if 'dummy' was only included because reduce appears to require an initial value, something like this may work for your purposes:
data.parallel().reduce((v1, v2) -> v1 + ":" + v2).orElse("???");
or
data.parallel().collect(Collectors.joining(":"));
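Note that the identity-free reduce returns an Optional&lt;String&gt; (hence the orElse fallback for an empty stream), while Collectors.joining (from java.util.stream.Collectors) handles both the separator and the empty-stream case for you.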
Edit - Why is reduce behaving like this?
Certain assumptions need to be made when aggregating data concurrently. For example, the order in which elements in the stream are merged should not affect the validity of the result. To address this, there are limitations on the kinds of functions that may be used in reduction operations. One limitation is:
'The identity value must be an identity for the combiner function. This means that for all u, combiner(identity, u) is equal to u.'
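To see this requirement in action, here is a minimal sketch (the class name DummySeed is mine, and the reduction is assumed to resemble the one in the question) showing what happens when the seed is not a true identity:

import java.util.stream.Stream;

public class DummySeed
{
    public static void main(String[] args)
    {
        // Each parallel chunk is seeded with "dummy", so the seed is
        // repeated once per chunk rather than appearing once overall.
        String result = Stream.of("a", "b", "c", "d", "e", "f")
                .parallel()
                .reduce("dummy", (v1, v2) -> v1 + ":" + v2);
        // Typically prints something like:
        // dummy:a:dummy:b:dummy:c:dummy:d:dummy:e:dummy:f
        System.out.println(result);
    }
}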
As such, there is no functional reason why the identity cannot be passed to all elements when executing in parallel. As for why the implementation actually behaves this way, considering the following may be useful (see the sketch after the list):
- What does it mean from a computation perspective if only the first element is allowed to be combined with the identity?
- What if I want to start processing the second element immediately as well?
(See this answer for more detail: https://stackoverflow.com/a/51290673/14294525)
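To make that concrete, the following illustrative sketch (the class name SplitSketch is mine) manually mirrors what a parallel reduce does: each half starts from the identity independently, and the partial results are combined afterwards:

import java.util.stream.Stream;

public class SplitSketch
{
    public static void main(String[] args)
    {
        // Both halves can begin reducing immediately and independently,
        // each seeded with the identity "".
        String left  = Stream.of("a", "b", "c").reduce("", SplitSketch::accumulate);
        String right = Stream.of("d", "e", "f").reduce("", SplitSketch::accumulate);

        // Combining the partial results still yields a:b:c:d:e:f,
        // precisely because "" is neutral for accumulate.
        System.out.println(accumulate(left, right));
    }

    private static String accumulate(String v1, String v2)
    {
        if (v1.isEmpty()) return v2;
        if (v2.isEmpty()) return v1;
        return v1 + ":" + v2;
    }
}

Because the identity is neutral, neither half has to wait for the other, which is exactly what makes parallel execution worthwhile.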