#include "TIMER1.h"
#include "MAIN.h"
typedef unsigned _int64 uint64;
void TASK1()
{
uint64 freq, start, end, diff;
//unsigned int milliseconds;
QueryPerformanceFrequency((LARGE_INTEGER*)&freq);
QueryPerformanceCounter((LARGE_INTEGER*)&start);
// code to measure
printf("hi\n");
printf("hi1\n");
printf("hi2\n");
QueryPerformanceCounter((LARGE_INTEGER*)&end);
diff = (((end - start) * 1000) / freq);
//milliseconds = (unsigned int)(diff & 0xffffffff);
printf("It took %u ms\n",diff);
}
I am calling TASK1() multiple times from main() and measuring how long the three printf calls (hi, hi1, hi2) take, by taking the difference between the counter values read before and after the prints. My question: why is there a delay in the output, and why does the measured time vary instead of being what I expect?
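For context, main() calls TASK1() roughly like this. This is only a minimal sketch: the loop count of 5 and the empty main() body are assumptions for illustration, not my exact code.

#include "TIMER1.h"   // assumed to declare void TASK1(void)

int main(void)
{
    // Call the timed task several times; each call prints hi/hi1/hi2
    // and reports its own elapsed time.
    for (int i = 0; i < 5; i++)
    {
        TASK1();
    }
    return 0;
}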
The output varies between runs, for example:

hi hi1 hi2 It took 0 ms
hi hi1 hi2 It took 1 ms

and sometimes:

hi1 hi2 It took 2 ms
What is the reason for this variation? And how can I change the above code to report the elapsed time in microseconds instead of milliseconds?