I have written some simple code:
int temp = 10;
long startTime = System.currentTimeMillis();
for (int i = 0; i < temp; i++) {
    for (int j = 0; j < temp; j++) {
        for (int k = 0; k < temp; k++) {
            System.out.println(i + " " + j + " " + k);
        }
    }
}
long stopTime = System.currentTimeMillis();
System.out.println("Total Time: " + (stopTime - startTime));
The total time for this program is around 50ms.
However, when I changed temp to 100, I first thought the total time would be around 500ms (because temp increased by a factor of 10, and 50 * 10 = 500). On the other hand, since this is a cubic algorithm (the time complexity is O(n^3)), I thought the result shouldn't be exactly 500ms; it should be more, because as temp increases, the total time grows according to the time complexity. When I actually ran the program, the result was 3234ms (around 3000ms consistently).
So, could someone explain how execution time changes according to the time complexity? For example, if for 10 items the execution time is 50ms and for 100 items it is around 3000ms, could someone explain more precisely why this happens?
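To make my confusion concrete, here is the arithmetic behind my expectation as a small standalone sketch (the class and method names are just for illustration, not part of my original program):

```java
// Sketch: compare how many times the innermost loop body runs for the two values of temp.
// Assumes the triple loop above executes temp^3 iterations in total.
public class IterationCount {
    static long iterations(int temp) {
        // O(n^3): the println in the innermost loop runs temp * temp * temp times
        return (long) temp * temp * temp;
    }

    public static void main(String[] args) {
        long atTen = iterations(10);      // temp = 10  -> 1,000 iterations
        long atHundred = iterations(100); // temp = 100 -> 1,000,000 iterations
        System.out.println("temp=10:  " + atTen);
        System.out.println("temp=100: " + atHundred);
        System.out.println("ratio:    " + (atHundred / atTen)); // 1000x more work
    }
}
```

So going from temp = 10 to temp = 100, the loop body runs 1000 times more often, yet the measured time only grew from about 50ms to about 3234ms, roughly a factor of 65. That mismatch is the part I don't understand.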