
My problem:

  1. Receiving a broadcast from the server.

  2. Receiving a broadcast from my intermediate layer.

I am receiving these two broadcasts in a UDP server application within the same millisecond. How do I find the difference between their arrival times with microsecond precision?

How can I measure time at microsecond precision?


2 Answers


Use QueryPerformanceCounter. Its resolution is machine dependent, but on modern processors the counter runs fast enough that you'll get better than microsecond accuracy.
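
A minimal sketch of that approach (Windows-specific; in practice you would take each timestamp right after the corresponding `recvfrom()` returns, and the variable names here are just illustrative):

```cpp
#include <windows.h>
#include <cstdio>

int main()
{
    LARGE_INTEGER frequency, t1, t2;

    // Ticks per second of the performance counter (fixed at boot).
    QueryPerformanceFrequency(&frequency);

    // Timestamp each event as it arrives.
    QueryPerformanceCounter(&t1);   // first broadcast received
    QueryPerformanceCounter(&t2);   // second broadcast received

    // Convert the tick difference to microseconds.
    double microseconds =
        (t2.QuadPart - t1.QuadPart) * 1000000.0 / frequency.QuadPart;

    printf("Delta between broadcasts: %.3f us\n", microseconds);
    return 0;
}
```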

  • A C++11 wrapper for that is `std::chrono::high_resolution_clock`. Its resolution is usually 100 nanoseconds. – rustyx Aug 04 '16 at 19:39
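
For comparison, a minimal, portable C++11 sketch of the `std::chrono` approach the comment mentions (names are illustrative):

```cpp
#include <chrono>
#include <cstdio>

int main()
{
    using clock = std::chrono::high_resolution_clock;

    // Timestamp each broadcast as it is received.
    clock::time_point t1 = clock::now();   // first broadcast
    clock::time_point t2 = clock::now();   // second broadcast

    // Express the difference in whole microseconds.
    auto us = std::chrono::duration_cast<std::chrono::microseconds>(t2 - t1);
    printf("Delta between broadcasts: %lld us\n",
           static_cast<long long>(us.count()));
    return 0;
}
```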

In the 1990s I wrote a program running on MS-DOS which used a product called Personal Computer High Resolution Timing (PCHRT) by a company called Ryle Design, in Mt. Pleasant, Michigan.

Their product was a library of routines that offered microsecond resolution. It worked beautifully, interfacing directly with the 8253 or 8254 timer chip found in PCs back in those days.

If you can track down Ryle Design, they may have found a way to provide high resolution timing under Microsoft Windows.

When Windows replaced MS-DOS, I had to abandon that product and move my product to custom hardware, using a microcontroller to achieve high-resolution timing. I couldn't find a way to get high-resolution timing under Microsoft Windows.

This article, although many years old, may help:

http://msdn.microsoft.com/en-us/magazine/cc163996.aspx
