
I was experimenting to see how much the time taken for a calculation differs between Java and Python. I came up with an arbitrary expression, evaluated it in both languages, and measured the elapsed time: Java took 1454584 ns, while Python reported either 0 or 1000 ns (I'm guessing anything under 1000 ns was rounded down to 0, or was too small to be detected). Is Java really this slow at relatively basic mathematical operations, is there some difference in how time is measured in these languages, or am I misunderstanding something about how the calculations are performed? I find it hard to believe the gap between the two is that large, but if it really is just because of the languages, then why?

Here is my Java code, which printed an elapsed time of 1454584 ns to the console. The expression assigned to `result` is just an arbitrary calculation I came up with on the spot.

double result;
long start = System.nanoTime();
result = 10 + 5 - Math.pow(9, 3) * 10 / 5 + 3 - 3 - 2 - 100 * 2 + 9000 / 50 + 0.2 * 0.9 / 3 + 1 / Math.pow(3, 2) * 6;
long time = System.nanoTime() - start;
System.out.println("Result: " + result);
System.out.println("Time elapsed: " + time + " ns.");

Here is my Python code, which reported an elapsed time of either 0 ns or 1000 ns over multiple trials.

import time

start = time.time_ns()
result = 10 + 5 - 9 ** 3 * 10 / 5 + 3 - 3 - 2 - 100 * 2 + 9000 / 50 + 0.2 * 0.9 / 3 + 1 / 3 ** 2 * 6
finish = time.time_ns()
print("Result:", result)
print("Time elapsed:", finish - start, "ns.")
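A more reliable way to time the Python side is the standard-library `timeit` module, which runs the statement many times and reports the total, averaging away timer resolution and per-call noise. A minimal sketch (the iteration count here is arbitrary):

```python
import timeit

# The same arbitrary expression, passed as a statement string.
expr = (
    "10 + 5 - 9 ** 3 * 10 / 5 + 3 - 3 - 2 - 100 * 2"
    " + 9000 / 50 + 0.2 * 0.9 / 3 + 1 / 3 ** 2 * 6"
)

# timeit.timeit runs the statement `number` times and returns the
# total elapsed time in seconds; dividing gives a per-run average.
n = 1_000_000
total = timeit.timeit(expr, number=n)
print(f"Average per evaluation: {total / n * 1e9:.2f} ns")
```

Note that a per-run average near zero is still expected here, since the expression is all constants and costs almost nothing per evaluation.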
starball
    You should probably at _least_ run the calculation in a loop for some large number of times. Also, are you sure that the operator precedence is the same in both languages? Maybe you should add brackets to rule out possible differences if you aren't sure. – starball Jan 02 '23 at 04:58
  • Note that it's not just Java that requires work to build an accurate microbenchmark -- so does Python. Use the `timeit` module, don't just compare `time.time_ns()` calls. – Charles Duffy Jan 02 '23 at 05:01
  • Also, "0 ns or 1000 ns" is telling you pretty clearly that the Python runtime in question (on whichever specific platform you're using for testing) doesn't actually have a nanosecond-accurate timer. That's another part of why good benchmarking implementations run something a given number of times (thousands? millions? whatever's appropriate) and then divide to get an average, in addition to doing things like turning off garbage collection, making sure any relevant JIT compilation process is warmed before timing starts, etc etc. – Charles Duffy Jan 02 '23 at 05:03
  • The most likely explanation is that the (Java) measurement is distorted by JVM startup / warmup effects. Please read [How do I write a correct micro-benchmark in Java?](https://stackoverflow.com/questions/504103). Then redo your benchmarking. If you still get a huge discrepancy between Python and Java, post the *complete* benchmark codes so that people can attempt to investigate it. – Stephen C Jan 02 '23 at 05:07
  • (in general, Java is slow to start, but much faster than Python when warmed up and running; see benchmarks @ https://julialang.org/benchmarks/ comparing those among many other languages for common math operations; notice the log scale -- a dot a full line above the same dot in another column reflects a 10x difference in performance. In those benchmarks, Java doing math in simple/idiomatic code is not more than 10x slower than C; Python doing math in simple/idiomatic code is sometimes very nearly 100x slower than C). – Charles Duffy Jan 02 '23 at 05:09
  • I also think there might be a difference between `Math.pow(9,3)` and `9**3`. The Java method call requires a conversion to `double` and the Python version could be unrolled by the compiler as `(9*9*9)` (You could do the same thing in Java manually). – markspace Jan 02 '23 at 05:52
  • For that matter, the Java compiler could just be precalculating _the whole thing_ and not doing any math at runtime at all. No variables unknown at compile time == no reason to defer the work until runtime. – Charles Duffy Jan 02 '23 at 15:59
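The compile-time precalculation point applies on the Python side as well: CPython's compiler constant-folds arithmetic on literals, so this all-constant expression is reduced to a single stored constant before it ever runs, leaving essentially nothing to time. You can check this with the standard-library `dis` module (a sketch; the exact bytecode varies by Python version):

```python
import dis

# Compile the all-constant expression and disassemble it. With
# CPython's constant folding, the bytecode typically contains no
# arithmetic instructions at all -- just a LOAD_CONST of the
# precomputed result followed by a return.
code = compile(
    "10 + 5 - 9 ** 3 * 10 / 5 + 3 - 3 - 2 - 100 * 2"
    " + 9000 / 50 + 0.2 * 0.9 / 3 + 1 / 3 ** 2 * 6",
    "<expr>",
    "eval",
)
dis.dis(code)
```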

0 Answers