What's faster?
List<E> bar = new ArrayList<>();
pan.stream() /* other functions */.forEach(bar::add);
or
List<E> bar = pan.stream() /* other functions */.collect(Collectors.toList());
I benchmarked both approaches with a stream of 1,000,000 elements. For a sequential stream there is practically no difference, but for a parallel stream collect comes out ahead:
Benchmark                    Mode  Cnt  Score    Error  Units
Performance.collect          avgt  200  0.022  ± 0.001  s/op
Performance.forEach          avgt  200  0.021  ± 0.001  s/op
Performance.collectParallel  avgt  200  0.124  ± 0.004  s/op
Performance.forEachParallel  avgt  200  0.131  ± 0.001  s/op
In my opinion you shouldn't build the list with forEach: adding to an external collection from the lambda is a side effect, which the Stream API documentation advises against, and collect is also more efficient with a parallel stream.
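To sketch why this matters for parallel streams (the class and variable names below are just for illustration): forEach with a plain ArrayList races across threads and can lose elements or throw, whereas collect gives each worker thread its own container and merges the partial results.

import java.util.ArrayList;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.IntStream;

public class ParallelCollectDemo {
    public static void main(String[] args) {
        // Side-effect version: many threads add to one non-thread-safe ArrayList,
        // so elements can be lost or an ArrayIndexOutOfBoundsException can be thrown.
        List<Integer> viaForEach = new ArrayList<>();
        try {
            IntStream.range(0, 1_000_000).parallel().boxed().forEach(viaForEach::add);
        } catch (RuntimeException e) {
            System.out.println("forEach failed: " + e);
        }
        System.out.println("forEach size: " + viaForEach.size()); // frequently < 1,000,000

        // collect version: each worker thread fills its own ArrayList and the
        // partial lists are combined, so no external synchronization is needed.
        List<Integer> viaCollect = IntStream.range(0, 1_000_000).parallel().boxed()
                .collect(Collectors.toList());
        System.out.println("collect size: " + viaCollect.size()); // always 1,000,000
    }
}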
For reference, here are the JMH benchmarks I used:

import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

import org.openjdk.jmh.annotations.Benchmark;
import org.openjdk.jmh.annotations.BenchmarkMode;
import org.openjdk.jmh.annotations.Mode;
import org.openjdk.jmh.infra.Blackhole;

public class Performance {

    @Benchmark @BenchmarkMode(Mode.AverageTime)
    public void collect(Blackhole blackhole) {
        // Sequential stream; the terminal collect builds the list.
        Stream<Double> stream = Stream.iterate(0.0, e -> Math.random());
        List<Double> list = stream.limit(1_000_000).collect(Collectors.toList());
        blackhole.consume(list);
    }

    @Benchmark @BenchmarkMode(Mode.AverageTime)
    public void forEach(Blackhole blackhole) {
        // Sequential stream; every element is added to an external ArrayList.
        Stream<Double> stream = Stream.iterate(0.0, e -> Math.random());
        List<Double> list = new ArrayList<>();
        stream.limit(1_000_000).forEach(list::add);
        blackhole.consume(list);
    }

    @Benchmark @BenchmarkMode(Mode.AverageTime)
    public void collectParallel(Blackhole blackhole) {
        // Parallel stream; collect merges the per-thread containers itself.
        Stream<Double> stream = Stream.iterate(0.0, e -> Math.random());
        List<Double> list = stream.parallel().limit(1_000_000).collect(Collectors.toList());
        blackhole.consume(list);
    }

    @Benchmark @BenchmarkMode(Mode.AverageTime)
    public void forEachParallel(Blackhole blackhole) {
        // Parallel stream; the target list has to be synchronized manually.
        Stream<Double> stream = Stream.iterate(0.0, e -> Math.random());
        List<Double> list = Collections.synchronizedList(new ArrayList<>());
        stream.parallel().limit(1_000_000).forEach(list::add);
        blackhole.consume(list);
    }
}
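If you want to reproduce the numbers, a minimal JMH runner looks roughly like this (a sketch; the fork/warmup/iteration settings here are assumptions, not the exact ones behind the table above):

import org.openjdk.jmh.runner.Runner;
import org.openjdk.jmh.runner.RunnerException;
import org.openjdk.jmh.runner.options.Options;
import org.openjdk.jmh.runner.options.OptionsBuilder;

public class PerformanceRunner {
    public static void main(String[] args) throws RunnerException {
        Options opts = new OptionsBuilder()
                .include(Performance.class.getSimpleName()) // run all Performance.* benchmarks
                .forks(1)                                   // assumption: single fork
                .warmupIterations(5)                        // assumption
                .measurementIterations(10)                  // assumption
                .build();
        new Runner(opts).run();
    }
}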