I would like to measure how much time it takes to execute a piece of code. What is the most effective and correct way to do this?
I wrote the code below, and the results are not the same every time. There is some run-to-run variation, and I do not know why it happens or how to remove it.
#include <stdio.h>
#include <time.h>

int main(void)
{
    int x = 10;
    int y = 25;
    int z = x + y;
    printf("Sum of x+y = %i", z);

    /* time the arithmetic loop */
    clock_t start = clock();
    for (int i = 0; i < 100000; i++) z = x + y;
    clock_t stop = clock();
    printf("\n\nArithmetic instructions take: %ld", (long)(stop - start));

    /* time the bitwise (logic) loop */
    start = clock();
    for (int i = 0; i < 100000; i++) z = x & y;
    stop = clock();
    printf("\n\nLogic instructions take: %ld", (long)(stop - start));

    return 0;
}
The results look like this:
Arithmetic instructions take: 327
Logic instructions take: 360
Arithmetic instructions take: 271
Logic instructions take: 271
Arithmetic instructions take: 287
Logic instructions take: 294
Arithmetic instructions take: 279
Logic instructions take: 266
Arithmetic instructions take: 265
Logic instructions take: 296
What other ways are there to measure the time it takes to execute the loops?
NOTICE: The loops are NOT removed by compiler optimization; I checked.
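(If it matters, my understanding is that declaring z as volatile is one way to guarantee that the stores inside the timed loop are kept; the fragment below is only an illustrative sketch of that idea, not the exact code I tested.)

#include <stdio.h>
#include <time.h>

/* Sketch only: storing into a volatile variable forces the compiler to
   keep every iteration of the timed loop, even with optimization enabled. */
int main(void)
{
    volatile int z;
    int x = 10, y = 25;

    clock_t start = clock();
    for (int i = 0; i < 100000; i++) z = x + y;
    clock_t stop = clock();

    printf("Arithmetic loop: %ld clock ticks\n", (long)(stop - start));
    return 0;
}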
So, what is the correct way to benchmark a piece of code?
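To illustrate the kind of answer I am looking for: would using the POSIX clock_gettime() with CLOCK_MONOTONIC, repeating the measurement several times, and keeping the fastest run be more reliable? The sketch below is only illustrative; it assumes a POSIX system, and the loop and repetition counts are arbitrary.

#define _POSIX_C_SOURCE 199309L   /* for clock_gettime() in strict C modes */
#include <stdio.h>
#include <time.h>

/* Elapsed nanoseconds between two timespec values. */
static long long elapsed_ns(struct timespec a, struct timespec b)
{
    return (long long)(b.tv_sec - a.tv_sec) * 1000000000LL
         + (b.tv_nsec - a.tv_nsec);
}

int main(void)
{
    volatile int z;    /* volatile so the loop body is not removed */
    int x = 10, y = 25;
    long long best = -1;

    /* Repeat the measurement and keep the fastest run, which is less
       sensitive to scheduling noise than a single measurement. */
    for (int rep = 0; rep < 10; rep++) {
        struct timespec t0, t1;
        clock_gettime(CLOCK_MONOTONIC, &t0);
        for (int i = 0; i < 100000; i++) z = x + y;
        clock_gettime(CLOCK_MONOTONIC, &t1);

        long long ns = elapsed_ns(t0, t1);
        if (best < 0 || ns < best) best = ns;
    }

    printf("Best of 10 runs: %lld ns\n", best);
    return 0;
}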