I understand the question may not make sense at first, but I'll explain it here.
First, I have the following C++ code, a simple Hello World:
#include <stdio.h>
#include <stdlib.h>

int main()
{
    printf("Hello World ");
    return 0;
}
Now I am calling it from within Java using this:
long start = System.nanoTime();
Process p = Runtime.getRuntime().exec("/home/name/./test");
long totalTime = System.nanoTime() - start;
System.out.println("Time: " + totalTime);
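For context, the snippet lives inside a small test class, roughly like this (the class name is just a placeholder for my local setup, and I run the measurement in a loop to collect several samples):

import java.io.IOException;

public class TimeTest {
    public static void main(String[] args) throws IOException {
        for (int i = 0; i < 4; i++) {
            long start = System.nanoTime();
            // launch the compiled binary as a child process
            Process p = Runtime.getRuntime().exec("/home/name/./test");
            long totalTime = System.nanoTime() - start;
            System.out.println("Time: " + totalTime);
        }
    }
}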
After doing this a couple of times, I get the following output:
Time: 8155128
Time: 732204
Time: 508819
Time: 662987
I wonder whether this is a correct way to measure the execution time of the C++ code, and I would also like to know why the first execution always shows a time about 10x bigger than the others (even if we are only talking about nanoseconds).
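One thing I am unsure about: as far as I know, exec() returns as soon as the child process has been spawned, not when it finishes, so maybe I should be waiting for the process to exit before stopping the timer. Something like this sketch (same code as above, plus waitFor()):

import java.io.IOException;

public class TimeTestWait {
    public static void main(String[] args) throws IOException, InterruptedException {
        long start = System.nanoTime();
        Process p = Runtime.getRuntime().exec("/home/name/./test");
        p.waitFor(); // block until the child process terminates
        long totalTime = System.nanoTime() - start;
        System.out.println("Time: " + totalTime);
    }
}

Would that change what I am actually measuring, or is the original version already giving me the execution time of the C++ program?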