
I'm relatively new to Java programming, and I'm running into an issue measuring how long a function takes to run.

First, some background: I've got a lot of experience with Python, and I'm trying to recreate the functionality of the Jupyter Notebook/Lab %%timeit cell magic, if you're familiar with that. Here's a pic of it in action (sorry, not enough reputation to embed yet):

Snip of Jupyter %%timeit

What it does is run the contents of the cell (in this case a recursive function) 1k, 10k, or 100k times, then report the average run time of the function along with the standard deviation.
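
To make it concrete, the numbers I'm after are just the mean and standard deviation over all the timed runs. Here's a rough sketch of the calculation I have in mind (printStats is just a helper name I made up, and I'm using the population standard deviation, which is close enough for my purposes):

static void printStats(long[] times) {
    // mean of the per-run timings (in nanoseconds)
    double mean = 0;
    for (long t : times) {
        mean += t;
    }
    mean /= times.length;

    // population standard deviation of the timings
    double sumSq = 0;
    for (long t : times) {
        sumSq += (t - mean) * (t - mean);
    }
    double stdDev = Math.sqrt(sumSq / times.length);

    System.out.println("Mean = " + mean + " ns");
    System.out.println("Std dev = " + stdDev + " ns");
}

So at the end of main I'd just call printStats(times).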

My first implementation (using the same recursive function) used System.nanoTime():

public static void main(String[] args) {
    long t1, t2, diff;
    long[] times = new long[1000];
    int t;

    for (int i = 0; i < 1000; i++) {
        t1 = System.nanoTime();
        t = triangle(20);
        t2 = System.nanoTime();

        diff = t2 - t1;
        System.out.println(diff);
        times[i] = diff;
    }

    long total = 0;
    for (int j = 0; j < times.length; j++) {
        total += times[j];
    }
    System.out.println("Mean = " + total / 1000.0);
}

But the mean is wildly thrown off -- for some reason, the first iteration of the function (on many runs) takes upwards of a million nanoseconds:

Pic of initial terminal output

Every iteration after the first dozen or so takes either 395 nanos or 0 -- so there could be a problem there too... not sure what's going on!
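
One thing I considered was simply throwing away the first chunk of iterations before averaging, on the theory that they're dominated by whatever startup cost causes that initial spike. Something like this (the warmup count is a number I picked arbitrarily, and I'm not sure whether this is a real fix or just hides the problem):

static double meanExcludingWarmup(long[] times, int warmup) {
    // average only the iterations after the first `warmup` runs
    long total = 0;
    for (int j = warmup; j < times.length; j++) {
        total += times[j];
    }
    return (double) total / (times.length - warmup);
}

...and then calling meanExcludingWarmup(times, 100) instead of printing total / 1000.0.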

Also -- here's the code of the recursive function I'm timing:

static int triangle(int n) {
    if (n == 1) {
        return n;
    } else {
        return n + triangle(n - 1);
    }
}

Initially I had the line n = Math.abs(n) on the first line of the function, but then I removed it because... meh. I'm the only one using this.
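
For reference, the version with that line looked like this:

static int triangle(int n) {
    n = Math.abs(n);    // guard so a negative argument can't recurse forever
    if (n == 1) {
        return n;
    } else {
        return n + triangle(n - 1);
    }
}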

I tried a number of different suggestions brought up in this SO post, but they each have their own problems... which I can go into if you need.

Anyway, thank you in advance for your help and expertise!

  • JIT. Warm-up. Benchmarking is hard. System clocks don't really do nanosecond resolution. What platform is this anyway? Which version of Java, which JRE, which operating system, what results do you expect? – Elliott Frisch Nov 14 '18 at 01:15
  • Using JDK10 + the JRE that came with it -- running on Windows 10. Is there a better timer to use? I tried the `Instant` and `Duration` ones mentioned in that other post, same problem there too. Edit: does the JVM come with its own timer? – Luke Nov 14 '18 at 01:19
  • Like [python's `timeit`](https://docs.python.org/2/library/timeit.html) (which you attributed to Jupyter)? No. But there is [`StopWatch`](https://commons.apache.org/proper/commons-lang/apidocs/org/apache/commons/lang3/time/StopWatch.html) in [Apache Commons Lang](https://commons.apache.org/proper/commons-lang/). – Elliott Frisch Nov 14 '18 at 01:28
  • I should have said -- does the JVM come with its own internal clock that I can reach out to, if the system clock is a bad way to go about this. I was referring to the Jupyter magic command, which uses the [python `timeit`](https://img.memecdn.com/you-got-me_o_1891513.jpg) function, yes. – Luke Nov 14 '18 at 01:38
  • `System.nanoTime` is the right method to use. You won't get anything more precise. Related: [How do I write a correct micro-benchmark in Java?](https://stackoverflow.com/questions/504103/how-do-i-write-a-correct-micro-benchmark-in-java) – Ole V.V. Nov 14 '18 at 07:21
  • Awesome -- thank you! – Luke Nov 14 '18 at 14:34
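
Edit: following up on the StopWatch suggestion in the comments -- here's roughly what I'd try with Apache Commons Lang's StopWatch (class name is just mine; as far as I can tell StopWatch wraps System.nanoTime() internally, so I'm assuming the warm-up spike would still show up):

import org.apache.commons.lang3.time.StopWatch;

public class StopWatchTiming {
    public static void main(String[] args) {
        long[] times = new long[1000];
        for (int i = 0; i < times.length; i++) {
            StopWatch sw = StopWatch.createStarted();  // starts timing immediately
            triangle(20);
            sw.stop();
            times[i] = sw.getNanoTime();               // elapsed time in nanoseconds
        }
        // ... same mean / standard deviation calculation as above
    }

    static int triangle(int n) {
        if (n == 1) {
            return n;
        } else {
            return n + triangle(n - 1);
        }
    }
}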
