After reading the question Why is processing a sorted array faster than an unsorted array?, I tried making the variables volatile. I expected that using volatile would make the code slower, but it actually runs faster. Here is my code without volatile (it runs in about 11 seconds):
import java.util.Arrays;
import java.util.Random;

public class GGGG {
    public static void main(String[] args) {
        int arraySize = 32768;
        int[] data = new int[arraySize];

        Random rnd = new Random(0);
        for (int c = 0; c < arraySize; ++c) {
            data[c] = rnd.nextInt() % 256;
        }
        Arrays.sort(data);

        long start = System.nanoTime();
        long sum = 0;
        for (int i = 0; i < 200000; ++i) {
            for (int c = 0; c < arraySize; ++c) {
                if (data[c] >= 128) {
                    sum += data[c];
                }
            }
        }
        System.out.println((System.nanoTime() - start) / 1000000000.0);
        System.out.println("sum = " + sum);
        System.out.println("=========================");
    }
}
The output is:
10.876173341
sum = 310368400000
=========================
And here is the version where arraySize and data are declared volatile; it runs in about 7 seconds:
import java.util.Arrays;
import java.util.Random;

public class GGGG {
    static volatile int arraySize = 32768;
    static volatile int[] data;

    public static void main(String[] args) {
        data = new int[arraySize];

        Random rnd = new Random(0);
        for (int c = 0; c < arraySize; ++c) {
            data[c] = rnd.nextInt() % 256;
        }
        Arrays.sort(data);

        long start = System.nanoTime();
        long sum = 0;
        for (int i = 0; i < 200000; ++i) {
            for (int c = 0; c < arraySize; ++c) {
                if (data[c] >= 128) {
                    sum += data[c];
                }
            }
        }
        System.out.println((System.nanoTime() - start) / 1000000000.0);
        System.out.println("sum = " + sum);
        System.out.println("=========================");
    }
}
The output with volatile is:
6.776267265
sum = 310368400000
=========================
I expected volatile to slow the program down, but it runs faster. What happened?
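For what it's worth, one way I know to check whether the volatile reads themselves matter is to copy the volatile fields into local variables before the timed loop, so the hot loop only touches locals. This is just a sketch of that idea (the class name VolatileLocalCopy and the run helper are my own additions, not part of the original code):

```java
import java.util.Arrays;
import java.util.Random;

public class VolatileLocalCopy {
    static volatile int arraySize = 32768;
    static volatile int[] data;

    // Runs the timed loop with the volatile fields hoisted into locals,
    // so every read inside the loop is an ordinary local-variable access.
    static long run(int iterations) {
        final int n = arraySize;      // single volatile read
        final int[] local = data;     // single volatile read of the reference
        long sum = 0;
        for (int i = 0; i < iterations; ++i) {
            for (int c = 0; c < n; ++c) {
                if (local[c] >= 128) {
                    sum += local[c];
                }
            }
        }
        return sum;
    }

    public static void main(String[] args) {
        // Same setup as the original benchmark.
        data = new int[arraySize];
        Random rnd = new Random(0);
        for (int c = 0; c < arraySize; ++c) {
            data[c] = rnd.nextInt() % 256;
        }
        Arrays.sort(data);

        long start = System.nanoTime();
        long sum = run(1000); // fewer iterations here, just to demonstrate
        System.out.println((System.nanoTime() - start) / 1000000000.0);
        System.out.println("sum = " + sum);
    }
}
```

If this version runs at roughly the same speed as both originals, the difference is presumably coming from something other than the volatile reads themselves, such as JIT warmup or compilation timing.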