
The timeGetTime function documentation says:

The default precision of the timeGetTime function can be five milliseconds or more, depending on the machine. You can use the timeBeginPeriod and timeEndPeriod functions to increase the precision of timeGetTime.

So the precision is system-dependent. But what if I don't want to increase the precision, I just want to know what it is in the current system. Is there a standard way (e.g. an API) to get it? Or should I just poll timeGetTime for a while and look at what comes out and deduce from there?
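
For reference, this is a minimal sketch of the "poll and deduce" approach I had in mind: spin on timeGetTime and record the smallest non-zero step between successive return values, and take the smallest observed step as the effective granularity.

    // Minimal sketch: observe the granularity of timeGetTime() by recording
    // the smallest non-zero step between successive return values.
    #include <windows.h>
    #include <mmsystem.h>   // timeGetTime
    #include <cstdio>
    #pragma comment(lib, "winmm.lib")

    int main()
    {
        DWORD prev = timeGetTime();
        DWORD minStep = MAXDWORD;

        // Watch a few hundred increments of the returned value.
        for (int increments = 0; increments < 200; )
        {
            DWORD now = timeGetTime();
            if (now != prev)
            {
                DWORD step = now - prev;
                if (step < minStep)
                    minStep = step;
                prev = now;
                ++increments;
            }
        }

        printf("Observed timeGetTime() granularity: %lu ms\n", minStep);
        return 0;
    }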

Joonas Pulakka
1 Answer


I'd suggest using the GetSystemTimeAsFileTime function. It has low overhead and reflects the system clock. See this answer for more details about the granularity of time and the APIs to query timer resolutions (e.g. NtQueryTimerResolution); code to find out how the system file time increments can be found there too. Windows 8 and Server 2012 provide the new GetSystemTimePreciseAsFileTime function, which is supposed to be more accurate. MSDN states it retrieves the system time "with the highest possible level of precision (<1us)". However, this only works on Windows 8 and Server 2012, and there is very little documentation about how this additional accuracy is obtained. It seems MS is going a Linux-like (gettimeofday) route, combining the performance counter frequency with the system clock. This post may be of interest to you too.
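
As an illustration (not production code), here is a minimal sketch that queries the current timer resolution through ntdll's NtQueryTimerResolution. The function is not declared in the public headers, so it is loaded dynamically here; the reported values are in 100-ns units.

    // Minimal sketch: query the timer resolution via the (undocumented)
    // NtQueryTimerResolution export of ntdll.dll. Values are in 100-ns units.
    #include <windows.h>
    #include <cstdio>

    typedef LONG (WINAPI *NtQueryTimerResolution_t)(PULONG MinimumResolution,
                                                    PULONG MaximumResolution,
                                                    PULONG CurrentResolution);

    int main()
    {
        HMODULE ntdll = GetModuleHandleW(L"ntdll.dll");
        if (!ntdll)
            return 1;

        NtQueryTimerResolution_t NtQueryTimerResolution =
            (NtQueryTimerResolution_t)GetProcAddress(ntdll, "NtQueryTimerResolution");
        if (!NtQueryTimerResolution)
            return 1;

        ULONG minRes = 0, maxRes = 0, curRes = 0;   // 100-ns units
        if (NtQueryTimerResolution(&minRes, &maxRes, &curRes) == 0 /* STATUS_SUCCESS */)
        {
            printf("Timer resolution: min %.3f ms, max %.3f ms, current %.3f ms\n",
                   minRes / 10000.0, maxRes / 10000.0, curRes / 10000.0);
        }
        return 0;
    }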

Edit: As of February 2014 there is some more detailed information about time matters on MSDN: Acquiring high-resolution time stamps.
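
That article recommends QueryPerformanceCounter / QueryPerformanceFrequency for high-resolution interval measurements; a minimal sketch of that usage:

    // Minimal sketch: take high-resolution time stamps with
    // QueryPerformanceCounter and convert the difference to microseconds.
    #include <windows.h>
    #include <cstdio>

    int main()
    {
        LARGE_INTEGER freq, t0, t1;
        QueryPerformanceFrequency(&freq);   // counts per second, fixed at boot

        QueryPerformanceCounter(&t0);
        Sleep(10);                          // the work being timed
        QueryPerformanceCounter(&t1);

        double elapsedUs = (t1.QuadPart - t0.QuadPart) * 1e6 / freq.QuadPart;
        printf("Elapsed: %.1f us (counter frequency %lld Hz)\n",
               elapsedUs, freq.QuadPart);
        return 0;
    }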

Arno
  • Thanks for the suggestions. It's amazing how many (and often ill-documented!) ways there are to do something as basic as measuring time... – Joonas Pulakka Mar 28 '13 at 09:37