I'm trying to calculate how long a single iteration takes in a for loop I've written that writes lines to a file in Java. After looking around, it seems a common way to do this is to grab the system time at the start and end of the work and subtract the two. Here's what I have in my code:
String content;
for (int i = 0; i < 100; i += 1) {
    long startTime = System.nanoTime();  // nanosecond timestamp before the write
    content = "Currently writing line " + i + " to the file.";
    myBufferedWriter.write(content);
    long endTime = System.nanoTime();    // nanosecond timestamp after the write
    System.out.println("This iteration took: " + (endTime - startTime) + " time.");
}
However, looking at my results, I mostly get something like this:
This iteration took: 1000 time
This iteration took: 1000 time
This iteration took: 0 time
This iteration took: 1000 time
...
This iteration took: 24000 time
This iteration took: 1000 time
This iteration took: 0 time
This iteration took: 17000 time
And so on. Most iterations seem to execute at a similar speed, but why is there that wild fluctuation at both ends of the spectrum? Sometimes an iteration appears to take no time at all (a difference of 0?) and sometimes it takes far longer. What gives?
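In case it helps, here's a self-contained sketch of roughly what I'm running. The class name, file path, and writer setup below are just placeholders standing in for my actual code:

import java.io.BufferedWriter;
import java.io.FileWriter;
import java.io.IOException;

public class IterationTiming {
    public static void main(String[] args) throws IOException {
        // "output.txt" is a placeholder path; my real code writes elsewhere.
        try (BufferedWriter myBufferedWriter = new BufferedWriter(new FileWriter("output.txt"))) {
            String content;
            for (int i = 0; i < 100; i += 1) {
                long startTime = System.nanoTime();  // nanosecond timestamp before the write
                content = "Currently writing line " + i + " to the file.";
                myBufferedWriter.write(content);
                long endTime = System.nanoTime();    // nanosecond timestamp after the write
                System.out.println("This iteration took: " + (endTime - startTime) + " time.");
            }
        }
    }
}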