Since I wanted to check several methods (foo(int)) for efficiency and compare them to the overall runtime, I ended up writing something similar to:
void testNanoTime() {
    long methodDuration = 0;
    long testStart = System.nanoTime();
    for (int i = 0; i < 300; i++) {
        long startJob = System.nanoTime();
        foo(i);  // checking how foo does with certain data
        methodDuration += System.nanoTime() - startJob;
        bar();   // intentionally not counted towards methodDuration
    }
    System.out.println("MethodDuration: " + methodDuration);
    System.out.println("TestDuration: " + (System.nanoTime() - testStart));
}
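For what it's worth, the diagnostic variant I am thinking of adding (only a sketch, the method name is mine) keeps the same structure but prints any per-call delta that comes out negative or implausible, so a single bad reading would not silently vanish into the sum:

// Sketch of a diagnostic variant, not the test I actually run.
// foo(int) and bar() are the same methods as in the test above.
void testNanoTimeWithSanityCheck() {
    long methodDuration = 0;
    long testStart = System.nanoTime();
    for (int i = 0; i < 300; i++) {
        long startJob = System.nanoTime();
        foo(i);
        long elapsed = System.nanoTime() - startJob;
        if (elapsed < 0) {  // should never happen for a single thread on one JVM
            System.out.println("Suspicious delta at i=" + i + ": " + elapsed + " ns");
        }
        methodDuration += elapsed;
        bar();
    }
    System.out.println("MethodDuration: " + methodDuration);
    System.out.println("TestDuration: " + (System.nanoTime() - testStart));
}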
foo(int) sometimes needs several minutes, up to half an hour, per call (but certainly not 293 years in total!). The issue is that sometimes the TestDuration (the time the whole test takes) is smaller than the methodDuration, which seems impossible to me. Hence two questions:
- To what extent are my older tests that compare methodDurations still valid?
- What should I use for further performance testing without the danger of getting invalid timestamps? Would System.currentTimeMillis() be safe, or does it have the same issues? (A sketch of what I mean is below.)
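To make the second question concrete, the System.currentTimeMillis() replacement I have in mind would simply swap the clock source (sketch only; millisecond resolution is of course much coarser, but foo(int) runs for minutes per call anyway):

// Sketch of the currentTimeMillis() variant I have in mind (the name is mine).
void testCurrentTimeMillis() {
    long methodDuration = 0;
    long testStart = System.currentTimeMillis();
    for (int i = 0; i < 300; i++) {
        long startJob = System.currentTimeMillis();
        foo(i);
        methodDuration += System.currentTimeMillis() - startJob;
        bar();
    }
    System.out.println("MethodDuration: " + methodDuration + " ms");
    System.out.println("TestDuration: " + (System.currentTimeMillis() - testStart) + " ms");
}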
Currently the testing takes place on a Linux system. I already found some older questions and answers regarding this issue here (e.g. Is System.nanoTime() completely useless?), but some state it is a Windows problem, others remain unclear or note that they might be outdated (and they are several years old), and many of them contradict each other.