#include "TIMER1.h"
#include "MAIN.h"  
typedef unsigned _int64 uint64;

void TASK1()
{
uint64 freq, start, end, diff;
//unsigned int milliseconds;

QueryPerformanceFrequency((LARGE_INTEGER*)&freq);
QueryPerformanceCounter((LARGE_INTEGER*)&start);

// code to measure
printf("hi\n");
printf("hi1\n");
printf("hi2\n");

QueryPerformanceCounter((LARGE_INTEGER*)&end);
diff = (((end - start) * 1000) / freq);
//milliseconds = (unsigned int)(diff & 0xffffffff);
printf("It took %u ms\n",diff);

}

I am calling TASK1() several times from main() and measuring how long the three printf calls ("hi", "hi1", "hi2") take, i.e. the difference between the start and end counter values taken around them, as in the code above. My question: why do I see a varying delay in the output instead of the consistent time I expect?

Observed output:

hi hi1 hi2 It took 0ms

hi hi1 hi2 It took 1ms

and sometimes: hi1 hi2 It took 2ms

What is the reason for that? And how do I change the above code from milliseconds to microseconds?

user2984410
  • Difficult to say; the problem with QueryPerformanceCounter is that its accuracy can degrade on modern processors that use power-saving features. There is an interesting post here: http://stackoverflow.com/questions/7287663/queryperformancecounter-status which may help. Running at user level, you also have to consider the latency between the moment the time is sampled and the moment it is reported by your app. – Jekyll Nov 12 '13 at 21:18
  • There's no maximum jitter, it all depends on what background tasks your operating system is running at the time. And do you really need help converting milliseconds to microseconds? You already convert from seconds to milliseconds. – Mark Ransom Nov 12 '13 at 21:23
  • Is this the right way to convert to microseconds: diff = (((end - start) * 1000000) / freq); ? – user2984410 Nov 12 '13 at 21:24
  • Thank you very much for the replies, Mark Ransom and Jekyll. – user2984410 Nov 12 '13 at 21:25
  • If I convert to microseconds as above, there is more jitter in the output. I am using the Windows operating system. – user2984410 Nov 12 '13 at 21:26
  • There isn't really more jitter, you're just seeing it to a higher level of precision. – Mark Ransom Nov 12 '13 at 21:29
  • Why on Earth would you not expect jitter? Code execution is in general highly irregular. But the time needed by printf() is especially unpredictable: there is process interop underneath that greatly depends on the state of the process that owns the console window. Get it to have to scroll the window and you'll see a completely different outcome. – Hans Passant Nov 12 '13 at 21:29
  • @Mark Ransom: I am looking for high precision. If I convert to microseconds, I get more jitter. – user2984410 Nov 12 '13 at 21:32
  • @Hans Passant: Thank you for the reply. Instead of printing, if I perform some other task, such as sending data, will there be less jitter? – user2984410 Nov 12 '13 at 21:33
  • I'm not going to make any predictions, no idea what's running on that machine. Don't ask me, just try it. – Hans Passant Nov 12 '13 at 21:40
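To see the jitter the comments describe, rather than a single sample, one option is to repeat the measurement and look at the spread (min/max/average). The following is a minimal sketch, not code from the question: it uses LARGE_INTEGER and a plain main() instead of TASK1(), the run count of 100 is arbitrary, and it assumes a Windows toolchain whose printf accepts %llu (older MSVC CRTs may need %I64u).

    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        LARGE_INTEGER freq, start, end;
        unsigned __int64 us, min_us = (unsigned __int64)-1, max_us = 0, sum_us = 0;
        int i, runs = 100;

        QueryPerformanceFrequency(&freq);                 /* counts per second */

        for (i = 0; i < runs; i++)
        {
            QueryPerformanceCounter(&start);
            printf("hi\nhi1\nhi2\n");                     /* the code being measured */
            QueryPerformanceCounter(&end);

            /* multiply first, then divide, to keep microsecond resolution */
            us = ((unsigned __int64)(end.QuadPart - start.QuadPart) * 1000000ULL)
                 / (unsigned __int64)freq.QuadPart;

            if (us < min_us) min_us = us;
            if (us > max_us) max_us = us;
            sum_us += us;
        }

        printf("min %llu us, max %llu us, avg %llu us over %d runs\n",
               min_us, max_us, sum_us / runs, runs);
        return 0;
    }

The spread between min and max is the jitter; it comes from the operating system and the console, not from QueryPerformanceCounter itself.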

1 Answer

  1. Varying delays. ... what is the reason for that ?

    printf() synchronises with other processes (the process that owns the console window), and the time it requires may therefore vary.

  2. ... how to change the above code from milliseconds to microseconds ?

    diff = (((end - start) * 1000000) / freq);
    printf("It took %llu us\n", diff);   /* %llu, since diff is 64-bit */
    
Arno