I was experimenting to see how much the time for a calculation differs between Java and Python. I came up with an arbitrary expression, evaluated it in both languages, and measured the elapsed time: Java took 1454584 ns, while Python reported either 0 or 1000 ns (I'm guessing anything under 1000 ns is rounded down to 0 or is too small to be detected). Is Java really this slow at relatively basic mathematical operations, is there some difference in how time is measured in these languages, or am I misunderstanding something about how the calculations are performed? I find it hard to believe the gap between the two is that large, but if it really is just because of the languages, then why?
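To test the rounding guess on the Python side, one check I could run is to ask what resolution the clocks actually report. Here is a minimal sketch; I'm assuming the "time" clock described by time.get_clock_info is the same underlying clock that time.time_ns() uses, and that time.perf_counter_ns() would be the higher-resolution alternative.
import time
# Reported resolution (in seconds) of the clock behind time.time()/time.time_ns()
print("time resolution:", time.get_clock_info("time").resolution)
# perf_counter is documented as the highest-resolution clock available
print("perf_counter resolution:", time.get_clock_info("perf_counter").resolution)
If the reported resolution is around a microsecond, that would explain why I only ever see 0 or 1000 ns.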
Here is my Java code, which printed an elapsed time of 1454584 ns to the console. The calculation assigned to result is just an arbitrary expression I came up with on the spot.
double result;
long start = System.nanoTime();
// Arbitrary expression I made up; Math.pow(9, 3) and Math.pow(3, 2) return doubles
result = 10 + 5 - Math.pow(9, 3) * 10 / 5 + 3 - 3 - 2 - 100 * 2 + 9000 / 50 + 0.2 * 0.9 / 3 + 1 / Math.pow(3, 2) * 6;
long time = System.nanoTime() - start;
System.out.println("Result: " + result);
System.out.println("Time elapsed: " + time + " ns.");
Here is my Python code, which reported either 0 ns or 1000 ns across multiple runs.
import time
start = time.time_ns()
result = 10 + 5 - 9 ** 3 * 10 / 5 + 3 - 3 - 2 - 100 * 2 + 9000 / 50 + 0.2 * 0.9 / 3 + 1 / 3 ** 2 * 6
finish = time.time_ns()
print("Result:", result)
print("Time elapsed:", finish - start, "ns.")