public class Runtime {
    public static void main(String[] args) {
        int[] n = {1, 100, 1000, 10000};

        // Part 1: time a single run of repeatString for each n
        for (int i = 0; i < 4; i++) {
            StringRepeater s = new StringRepeater();
            long start = System.nanoTime();
            s.repeatString("hello", n[i]);
            long stop = System.nanoTime();
            long runtime = stop - start;
            System.out.println("T(" + n[i] + ") = " + runtime / 1000000000.0 + " seconds");
        }

        // Part 2: time 100 runs for each n and report the average
        for (int i = 0; i < 4; i++) {
            long start = 0;
            long stop = 0;
            long runtime100 = 0;
            for (int j = 0; j < 100; j++) {
                StringRepeater s = new StringRepeater();
                start = System.nanoTime();
                s.repeatString("hello", n[i]);
                stop = System.nanoTime();
                runtime100 = runtime100 + (stop - start);
            }
            // divide by 100 runs and by 1e9 ns per second, i.e. by 1e11 in total
            System.out.println("T(" + n[i] + ") = " + runtime100 / 100000000000.0 + " seconds");
        }
    }
}
So I've got the code above, which measures the runtime of repeatString:
public class StringRepeater {
    // builds s repeated n times by repeated concatenation
    public String repeatString(String s, int n) {
        String result = "";
        for (int i = 0; i < n; i++) {
            result = result + s;
        }
        return result;
    }
}
The top part with one for loop measures the runtime of a single run. The bottom part with two nested for loops measures the average runtime over 100 runs. But for some reason the runtime reported by the second part is on average much lower, especially for small n. For n=1 it's even about 100 times faster.
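Just to make the arithmetic in the second part explicit (in case the discrepancy is simply my own division), here is the same averaging pulled into a helper method. This is only a sketch, and timeAverageSeconds is a name I made up for it, not something in my actual code:

    // Sketch: the averaging from the second part, written out explicitly.
    static double timeAverageSeconds(int n, int runs) {
        long total = 0;                              // total elapsed nanoseconds over all runs
        for (int j = 0; j < runs; j++) {
            StringRepeater s = new StringRepeater();
            long start = System.nanoTime();
            s.repeatString("hello", n);
            long stop = System.nanoTime();
            total += stop - start;
        }
        // average per run, converted from nanoseconds to seconds
        // (dividing by runs * 1e9 is the same as dividing by 1e11 when runs == 100)
        return (total / (double) runs) / 1000000000.0;
    }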
Single run:
T(1) = 2.3405E-5 seconds
T(100) = 1.47748E-4 seconds
T(1000) = 0.00358515 seconds
T(10000) = 0.173254266 seconds

Average of 100 runs:
T(1) = 1.9015E-7 seconds
T(100) = 3.035997E-5 seconds
T(1000) = 0.00168481277 seconds
T(10000) = 0.10354477848 seconds
This is a typical result. Is my code wrong, or is there something else going on? TL;DR: why is the average runtime so much lower than the single-run runtime? You would expect the two to be fairly similar, right?
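To see where the difference comes from, one thing I could do is print each of the 100 iterations individually instead of only the total, to check whether the early iterations account for most of the time. A rough sketch, reusing the same n array and StringRepeater from above:

    // Sketch: time every one of the 100 runs separately instead of summing them,
    // to see whether the first few iterations are the slow ones.
    for (int i = 0; i < n.length; i++) {
        for (int j = 0; j < 100; j++) {
            StringRepeater s = new StringRepeater();
            long start = System.nanoTime();
            s.repeatString("hello", n[i]);
            long stop = System.nanoTime();
            System.out.println("n=" + n[i] + " run " + j + ": "
                    + (stop - start) / 1000000000.0 + " seconds");
        }
    }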