
After reading Why is processing a sorted array faster than an unsorted array?, I tried making the variables volatile. I expected that with volatile the code would run slower, but it actually runs faster. Here is my code without volatile (it runs in about 11 seconds):

import java.util.Arrays;
import java.util.Random;

public class GGGG {

public static void main(String[] args) {
    int arraySize = 32768;
    int data[];
    data = new int[arraySize];

    Random rnd = new Random(0);
    for (int c = 0; c < arraySize; ++c) {
        data[c] = rnd.nextInt() % 256;
    }

    Arrays.sort(data);

    long start = System.nanoTime();
    long sum = 0;

    for (int i = 0; i < 200000; ++i) {
        for (int c = 0; c < arraySize; ++c) {
            if (data[c] >= 128) {
                sum += data[c];
            }
        }
    }

    System.out.println((System.nanoTime() - start) / 1000000000.0);
    System.out.println("sum = " + sum);

    System.out.println("=========================");
    }
}

And output is:

10.876173341
sum = 310368400000
=========================



And this is the version where the arraySize and data variables are declared volatile; it runs in about 7 seconds:

import java.util.Arrays;
import java.util.Random;

public class GGGG {

static volatile int arraySize = 32768;
static volatile int data[];

public static void main(String[] args) {
    data = new int[arraySize];

    Random rnd = new Random(0);
    for (int c = 0; c < arraySize; ++c) {
        data[c] = rnd.nextInt() % 256;
    }

    Arrays.sort(data);

    long start = System.nanoTime();
    long sum = 0;

    for (int i = 0; i < 200000; ++i) {
        for (int c = 0; c < arraySize; ++c) {
            if (data[c] >= 128) {
                sum += data[c];
            }
        }
    }

    System.out.println((System.nanoTime() - start) / 1000000000.0);
    System.out.println("sum = " + sum);

    System.out.println("=========================");
    }
}

And output with volatile is:

6.776267265
sum = 310368400000
=========================

I expected volatile to slow the process down, but it runs faster. What happened?

RustamIS
  • possible duplicate of http://stackoverflow.com/questions/1090311/are-volatile-variable-reads-as-fast-as-normal-reads – Mik378 Feb 06 '14 at 09:22
  • With just `static` on the variables I got 4.5 as the output time; with `static volatile` I get 11.7. Also, if I make the variables neither `static` nor `volatile` and just put them in the method, it's 4.6 – David Feb 06 '14 at 09:25
  • You are declaring your variables once as `static` fields and once as local variables of the method `main`. Declaring a big array as a local variable of a method may imply an overhead, which would explain why volatile is going faster in your case. Compare with static variables both times. – LaurentG Feb 06 '14 at 09:29
  • I just checked it with `static` (without `volatile`) and it gives me 8 seconds. And what is the difference between declaring inside the method and outside of it (even as `static`)? All I'm doing is timing the logic :) – RustamIS Feb 06 '14 at 09:32

1 Answer


I'll name just two main issues with your code:

  1. there is no warmup;
  2. everything happens in the main method, so the JIT-compiled code can only be entered via on-stack replacement (OSR).
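Both issues can also be fixed by hand, without a benchmark framework. The sketch below (class and method names are my own, not from the question) extracts the hot loop into its own method, so the JIT compiles the whole method once it is hot, and runs untimed warmup passes before starting the clock:

```java
import java.util.Arrays;
import java.util.Random;

public class WarmupDemo {
    static final int ARRAY_SIZE = 32768;

    // The hot loop lives in its own method: once it becomes hot, the JIT
    // compiles the whole method and every later call runs compiled code,
    // instead of relying on on-stack replacement inside main.
    static long sum(int[] data) {
        long sum = 0;
        for (int c = 0; c < ARRAY_SIZE; ++c) {
            if (data[c] >= 128) {
                sum += data[c];
            }
        }
        return sum;
    }

    public static void main(String[] args) {
        int[] data = new int[ARRAY_SIZE];
        Random rnd = new Random(0);
        for (int c = 0; c < ARRAY_SIZE; ++c) {
            data[c] = rnd.nextInt() % 256;
        }
        Arrays.sort(data);

        // Warmup: let the JIT compile sum() before the timing starts.
        // The result is accumulated so the calls cannot be eliminated.
        long warmup = 0;
        for (int i = 0; i < 10_000; ++i) {
            warmup += sum(data);
        }

        long start = System.nanoTime();
        long total = 0;
        for (int i = 0; i < 10_000; ++i) {
            total += sum(data);
        }
        System.out.println((System.nanoTime() - start) / 1_000_000_000.0);
        System.out.println("sum = " + total + " (warmup " + warmup + ")");
    }
}
```

This only approximates what a harness does for you; JMH, used below, also handles forking, dead-code elimination, and statistics.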

Redoing your case with the JMH tool, I get exactly the times one would expect:

import java.util.Arrays;
import java.util.Random;
import java.util.concurrent.TimeUnit;

import org.openjdk.jmh.annotations.*;

@OutputTimeUnit(TimeUnit.MICROSECONDS)
@BenchmarkMode(Mode.AverageTime)
@Warmup(iterations = 3, time = 2)
@Measurement(iterations = 5, time = 3)
@State(Scope.Thread)
@Threads(1)
@Fork(2)
public class Writing
{
  static final int ARRAY_SIZE = 32768;

  int data[] = new int[ARRAY_SIZE];
  volatile int volatileData[] = new int[ARRAY_SIZE];

  @Setup public void setup() {
    Random rnd = new Random(0);
    for (int c = 0; c < ARRAY_SIZE; ++c) {
      data[c] = rnd.nextInt() % 256;
      volatileData[c] = rnd.nextInt() % 256;
    }
    Arrays.sort(data);
    System.arraycopy(data, 0, volatileData, 0, ARRAY_SIZE);
  }

  @GenerateMicroBenchmark
  public long sum() {
    long sum = 0;
    for (int c = 0; c < ARRAY_SIZE; ++c) if (data[c] >= 128) sum += data[c];
    return sum;
  }

  @GenerateMicroBenchmark
  public long volatileSum() {
    long sum = 0;
    for (int c = 0; c < ARRAY_SIZE; ++c) if (volatileData[c] >= 128) sum += volatileData[c];
    return sum;
  }
}

These are the results:

Benchmark       Mode   Samples         Mean   Mean error    Units
sum             avgt        10       21.956        0.221    us/op
volatileSum     avgt        10       40.561        0.264    us/op
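The gap is consistent with the loop re-reading the volatile field on every access, which the JIT may not hoist out of the loop. As a standalone sketch of that point (my own variant, not part of the answer; I would expect, but have not measured here, that the cached version runs close to the plain one):

```java
import java.util.Arrays;
import java.util.Random;

public class VolatileCache {
    static final int ARRAY_SIZE = 32768;
    static volatile int[] volatileData = new int[ARRAY_SIZE];

    // Every element access goes through the volatile field, so the JIT
    // must re-read the field each time and cannot hoist it out of the loop.
    static long volatileSum() {
        long sum = 0;
        for (int c = 0; c < ARRAY_SIZE; ++c) {
            if (volatileData[c] >= 128) sum += volatileData[c];
        }
        return sum;
    }

    // One volatile read up front; the loop then works on a plain local
    // reference, so the array accesses compile like the non-volatile case.
    static long cachedSum() {
        int[] local = volatileData;
        long sum = 0;
        for (int c = 0; c < ARRAY_SIZE; ++c) {
            if (local[c] >= 128) sum += local[c];
        }
        return sum;
    }

    public static void main(String[] args) {
        Random rnd = new Random(0);
        int[] tmp = new int[ARRAY_SIZE];
        for (int c = 0; c < ARRAY_SIZE; ++c) tmp[c] = rnd.nextInt() % 256;
        Arrays.sort(tmp);
        volatileData = tmp;
        System.out.println(volatileSum() + " == " + cachedSum());
    }
}
```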
Marko Topolnik