2

I have a program that reads the current time from the system clock and saves it to a text file. I previously used the GetSystemTime function, which worked, but the times weren't completely consistent, e.g. one time is 32567.789 and the next is 32567.780, which is backwards in time.

I am using this program to save the time up to 10 times a second. I read that the GetSystemTimeAsFileTime function is more accurate. My question is, how do I convert my current code to use the GetSystemTimeAsFileTime function? I tried to use the FileTimeToSystemTime function, but that had the same problems.

SYSTEMTIME st;
GetSystemTime(&st);

// Note: WORD is only 16 bits, so seconds-in-a-day (up to 86399) overflows it; use int.
int sec = (st.wHour * 3600) + (st.wMinute * 60) + st.wSecond; // convert to seconds in a day
// %03d keeps leading zeros in the milliseconds, e.g. 5 ms prints as .005, not .5
lStr.Format(_T("%d   %d.%03d\n"), GetFrames(), sec, st.wMilliseconds);

std::wfstream myfile;
myfile.open("time.txt", std::ios::out | std::ios::in | std::ios::app);
if (myfile.is_open())
{
    myfile.write((LPCTSTR)lStr, lStr.GetLength());
    myfile.close();
}
else
{
    // GetLastError() reports file I/O errors; WSAGetLastError() is for Winsock only.
    lStr.Format(_T("open file failed: %d"), GetLastError());
}

EDIT: To add some more info, the code captures an image from a camera 10 times every second and saves the time each image was taken into a text file. When I subtract the 1st entry of the text file from the 2nd, and so on (entry 2-1, 3-2, 4-3, etc.), I get this graph, where the x axis is the number of entries and the y axis is the subtracted values.

(graph omitted: x axis is entry number, y axis is the difference between consecutive recorded times)

All of them should be around the 0.12 mark, which most of them are. However, you can see that a lot of them vary, and some even go negative. This isn't due to the camera, because the camera has its own internal clock and that has no variations. It has something to do with capturing the system time. What I want is the most accurate method to extract the system time, with the highest resolution and as little noise as possible.

Edit 2: I have taken your suggestions on board and run the program again. This is the result:

(graph omitted: time differences between consecutive entries after applying the suggestions)

As you can see, it is a lot better than before, but it is still not right. I find it strange that the error seems to build up incrementally. I also plotted the raw times, and this is the result, where x is the entry and y is the time:

(graph omitted: x axis is entry number, y axis is the recorded time)

Does anyone have any idea what could be causing the time to drift out every 30 frames or so?

oodan123
  • Where did you read that they're more accurate? I would expect them to be the same (although the `FILETIME` version would probably be quicker/more efficient as it doesn't have to do the day/month/year calculations). – Jonathan Potter Sep 09 '15 at 03:36
  • [Arno's comments in this thread](http://stackoverflow.com/questions/3162826/fastest-timing-resolution-system?rq=1) – oodan123 Sep 09 '15 at 03:42
  • It just isn't nearly precise enough for your needs. Use GetSystemTimePreciseAsFileTime() on Win8 and higher, timeBeginPeriod(1) on older versions. – Hans Passant Sep 09 '15 at 04:41
  • yeah i'm on windows 7 and I put timeBeginPeriod(1) and timeEndPeriod around the block of code and it didn't seem to make a difference – oodan123 Sep 09 '15 at 05:22
  • I have added another edit – oodan123 Sep 09 '15 at 05:39
  • It helps to break down exactly what properties you want from the timer. I think that the most important criterion for you is that the _relative_ values (`time2 - time1`) are accurate, even if the two times are obtained on different CPU cores. The absolute time relative to UTC is totally irrelevant to you, and a nanosecond _resolution_ is equally irrelevant. (Nanosecond precision in no way implies nanosecond accuracy) – MSalters Sep 09 '15 at 09:32

3 Answers

6

First of all, you want to get the FILETIME as follows:

FILETIME fileTime;
GetSystemTimeAsFileTime(&fileTime);
// Or for higher precision, use
// GetSystemTimePreciseAsFileTime(&fileTime);

According to FILETIME's documentation,

It is not recommended that you add and subtract values from the FILETIME structure to obtain relative times. Instead, you should copy the low- and high-order parts of the file time to a ULARGE_INTEGER structure, perform 64-bit arithmetic on the QuadPart member, and copy the LowPart and HighPart members into the FILETIME structure.

So, what you should do next is:

ULARGE_INTEGER theTime;
theTime.LowPart = fileTime.dwLowDateTime;
theTime.HighPart = fileTime.dwHighDateTime;

__int64 fileTime64Bit = theTime.QuadPart;

And that's it. The fileTime64Bit variable now contains the time you're looking for.

If you want to get a SYSTEMTIME object instead, you could just do the following:

SYSTEMTIME systemTime;
FileTimeToSystemTime(&fileTime, &systemTime);
  • Thanks for the answer. it does get the system time but there are variations in when it gets the time. I have added an edit to my question. – oodan123 Sep 09 '15 at 04:23
  • What if you try to change your code a little bit such that it opens the file once in the beginning and closes it at the very end. I'm curious whether the opening/closing file for append operation is causing the overhead or something else is causing it. –  Sep 09 '15 at 04:33
  • Thanks, I shall try that and let you know – oodan123 Sep 09 '15 at 04:35
  • Yeah that made no difference unfortunately – oodan123 Sep 09 '15 at 05:21
4

Getting the system time out of Windows with decent accuracy is something that I've had fun with, too... I discovered that Javascript code running on Chrome seemed to produce more consistent timer results than I could with C++ code, so I went looking in the Chrome source. An interesting place to start is the comments at the top of time_win.cc in the Chrome source. The links given there to a Mozilla bug and a Dr. Dobb's article are also very interesting.

Based on the Mozilla and Chrome sources, and the above links, the code I generated for my own use is here. As you can see, it's a lot of code!

The basic idea is that getting the absolute current time is quite expensive. Windows does provide a high resolution timer that's cheap to access, but that only gives you a relative, not absolute time. What my code does is split the problem up into two parts:

1) Get the system time accurately. This is in CalibrateNow(). The basic technique is to call timeBeginPeriod(1) to get accurate times, then call GetSystemTimeAsFileTime() until the result changes, which means that the timeBeginPeriod() call has had an effect. This gives us an accurate system time, but is quite an expensive operation (and the timeBeginPeriod() call can affect other processes) so we don't want to do it each time we want a time. The code also calls QueryPerformanceCounter() to get the current high resolution timer value.

bool NeedCalibration = true;
LONGLONG CalibrationFreq = 0;
LONGLONG CalibrationCountBase = 0;
ULONGLONG CalibrationTimeBase = 0;

void CalibrateNow(void)
{
  // If the timer frequency is not known, try to get it
  if (CalibrationFreq == 0)
  {
    LARGE_INTEGER freq;
    if (::QueryPerformanceFrequency(&freq) == 0)
      CalibrationFreq = -1;
    else
      CalibrationFreq = freq.QuadPart;
  }

  if (CalibrationFreq > 0)
  {
    // Get the current system time, accurate to ~1ms
    FILETIME ft1, ft2;
    ::timeBeginPeriod(1);
    ::GetSystemTimeAsFileTime(&ft1);
    do
    {
      // Loop until the value changes, so that the timeBeginPeriod() call has had an effect
      ::GetSystemTimeAsFileTime(&ft2);
    }
    while (FileTimeToValue(ft1) == FileTimeToValue(ft2));
    ::timeEndPeriod(1);

    // Get the current timer value
    LARGE_INTEGER counter;
    ::QueryPerformanceCounter(&counter);

    // Save calibration values
    CalibrationCountBase = counter.QuadPart;
    CalibrationTimeBase = FileTimeToValue(ft2);
    NeedCalibration = false;
  }
}

2) When we want the current time, get the high resolution timer by calling QueryPerformanceCounter(), and use the change in that timer since the last CalibrateNow() call to work out an accurate "now". This is in Now() in my code. This also periodically calls CalibrateNow() to ensure that the system time doesn't go backwards or drift out.

FILETIME GetNow(void)
{
  for (int i = 0; i < 4; i++)
  {
    // Calibrate if needed, and give up if this fails
    if (NeedCalibration)
      CalibrateNow();
    if (NeedCalibration)
      break;

    // Get the current timer value and use it to compute now
    FILETIME ft;
    ::GetSystemTimeAsFileTime(&ft);
    LARGE_INTEGER counter;
    ::QueryPerformanceCounter(&counter);
    LONGLONG elapsed = ((counter.QuadPart - CalibrationCountBase) * 10000000) / CalibrationFreq;
    ULONGLONG now = CalibrationTimeBase + elapsed;

    // Don't let time go back
    static ULONGLONG lastNow = 0;
    now = max(now,lastNow);
    lastNow = now;

    // Check for clock skew
    if (LONGABS(FileTimeToValue(ft) - now) > 2 * GetTimeIncrement())
    {
      NeedCalibration = true;
      lastNow = 0;
    }

    if (!NeedCalibration)
      return ValueToFileTime(now);
  }

  // Calibration has failed to stabilize, so just use the system time
  FILETIME ft;
  ::GetSystemTimeAsFileTime(&ft);
  return ft;
}

It's all a bit hairy but works better than I had hoped. This also seems to work well as far back on Windows as I have tested (which was Windows XP).

DavidK
  • awesome answer! So if I use `FileTimeToSystemTime` on the now() filetime that was calculated will I keep that accuracy or will it go back to a relative time? – oodan123 Sep 11 '15 at 00:23
  • I had a go at trying to implement it on my program and I couldn't seem to improve the time – oodan123 Sep 11 '15 at 02:01
  • Hmmm, hard to know where the problem lies. If you tried posting a complete example with the attempt at implementing the above it would be possible to have a play with it. – DavidK Sep 11 '15 at 09:30
  • 1
    The more I think about this, the more it seems to me that you need to be sure that your trigger is as regular as you think (that is, whatever gives you the X axis values, presumably something to do with your camera. I would try just recording the high resolution timer (from QueryPerformanceCounter()) and graphing that. Although the timer is just a relative number, not an absolute time, it should give you an idea of whether the variations you see really come from problems determining the current time, or a lack of regularity in whatever triggers the code each time. – DavidK Sep 11 '15 at 19:43
1

I believe you are looking for the GetSystemTimePreciseAsFileTime() function, or even QueryPerformanceCounter(), in short, something that is guaranteed to produce monotonic values.

c-smile
  • Thanks, do you have an example? I seem to be having trouble replacing it with GetSystemTimeAsFileTime, i'm on windows 7 – oodan123 Sep 09 '15 at 04:40
  • @oodan123, See my answer. I've included an example. You could uncomment the code and comment the `GetSystemTimeAsFileTime()` function instead –  Sep 09 '15 at 04:43
  • I can't use the function because it is only on Windows 8 or higher. I tried timeBeginPeriod(1), but that didn't do anything either – oodan123 Sep 09 '15 at 05:23