I have a large number of data points, roughly 900,000, that I need to run the same calculation on. Each data point is independent of the others. I'm looking for the fastest way to do all the calculations. Some crude testing I did comparing similar operations (arrays with for loops versus streams with a map over an ArrayList) was not very conclusive, but it appeared that the for loop was faster.
My dev box is very different from the deployed environment as well, so I am not even sure doing it one way on my dev box will result in similar gains in the deployed environment.
I am limited to Java 8.
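For reference, here is a sketch of the kind of crude timing harness I used (sizes and constants here are made up for illustration; a real comparison would use JMH, since a hand-rolled `nanoTime` loop is sensitive to JIT warm-up):

```java
import java.util.Random;
import java.util.stream.IntStream;

public class CrudeBench {
    public static void main(String[] args) {
        int n = 900_000;
        double myconst = 3.14;
        double[] dp1 = new Random(42).doubles(n).toArray();
        double[] dp2 = new Random(43).doubles(n).toArray();

        // Warm up both paths so the JIT compiles the hot code before timing.
        for (int w = 0; w < 10; w++) {
            loopSum(dp1, dp2, myconst);
            streamSum(dp1, dp2, myconst);
        }

        long t0 = System.nanoTime();
        double[] a = loopSum(dp1, dp2, myconst);
        long t1 = System.nanoTime();
        double[] b = streamSum(dp1, dp2, myconst);
        long t2 = System.nanoTime();

        System.out.printf("loop:   %.2f ms%n", (t1 - t0) / 1e6);
        System.out.printf("stream: %.2f ms%n", (t2 - t1) / 1e6);
    }

    // Plain indexed loop over primitive arrays.
    static double[] loopSum(double[] dp1, double[] dp2, double c) {
        double[] r = new double[dp1.length];
        for (int i = 0; i < dp1.length; i++) {
            r[i] = dp1[i] + dp2[i] + c;
        }
        return r;
    }

    // Parallel stream over indices; toArray() keeps the encounter order.
    static double[] streamSum(double[] dp1, double[] dp2, double c) {
        return IntStream.range(0, dp1.length)
                        .parallel()
                        .mapToDouble(i -> dp1[i] + dp2[i] + c)
                        .toArray();
    }
}
```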
An example of what I want to do: for a series of objects with two data points, obj.a and obj.b, add a constant to obj.a + obj.b.
With array and for loop
double[] dp1 = ...; // some source
double[] dp2 = ...; // some source
double[] result = new double[dp1.length];
for (int i = 0; i < dp1.length; i++) {
    result[i] = dp1[i] + dp2[i] + myconst;
}
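One variant I've considered, since the inputs are already primitive arrays, is parallelizing the same loop with Arrays.parallelSetAll (available in Java 8). A minimal sketch with made-up values; whether it beats the plain loop at ~900,000 elements presumably depends on the hardware:

```java
import java.util.Arrays;

public class ParallelSetAllSketch {
    public static void main(String[] args) {
        double myconst = 0.5;
        double[] dp1 = {1.0, 2.0, 3.0};
        double[] dp2 = {4.0, 5.0, 6.0};

        double[] result = new double[dp1.length];
        // Fills result in parallel; each index is computed independently,
        // so the original ordering is preserved by construction.
        Arrays.parallelSetAll(result, i -> dp1[i] + dp2[i] + myconst);

        System.out.println(Arrays.toString(result)); // [5.5, 7.5, 9.5]
    }
}
```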
With stream
ArrayList<MyObj> source = ...; // some source of objects with dp1 and dp2
double[] result = source.stream().parallel().mapToDouble(a -> a.dp1 + a.dp2 + myconst).toArray();
In general, I understand there is overhead with ArrayList (boxing, indirection) which may mean the collection needs a minimum size before streams show gains. I'd also like to know if there is a better way to structure the stream operations, or whether plain iteration is better. I'm not sure forEach works in this case: it is a terminal operation, and I need to capture the results in the original order of the array/ArrayList.
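To make the ordering concern concrete, here is a self-contained sketch of the stream version (MyObj and its field names are placeholders based on the snippet above). On an ordered source like an ArrayList, toArray() as the terminal operation preserves encounter order even when the stream is parallel, and collecting into a double[] avoids boxing into ArrayList<Double>:

```java
import java.util.ArrayList;
import java.util.Arrays;

public class StreamOrderSketch {
    // Hypothetical holder for the two data points per object.
    static class MyObj {
        final double dp1, dp2;
        MyObj(double dp1, double dp2) { this.dp1 = dp1; this.dp2 = dp2; }
    }

    static double[] compute(ArrayList<MyObj> source, double myconst) {
        // toArray() is the terminal operation; it keeps the list's
        // encounter order even for a parallel stream, so no forEach
        // (or manual index bookkeeping) is needed.
        return source.stream()
                     .parallel()
                     .mapToDouble(a -> a.dp1 + a.dp2 + myconst)
                     .toArray();
    }

    public static void main(String[] args) {
        ArrayList<MyObj> source = new ArrayList<>();
        source.add(new MyObj(1.0, 2.0));
        source.add(new MyObj(3.0, 4.0));
        double[] result = compute(source, 0.5);
        System.out.println(Arrays.toString(result)); // [3.5, 7.5]
    }
}
```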