
After delving into VBA benchmarking (see also), I'm not satisfied those answers go into sufficient detail. From a similar question about timing in Go, I see there is a difference between measuring absolute time and changes in time. For absolute time, a "Wall clock" should be used, which can be synchronised between machines using, for example, the Network Time Protocol.

Meanwhile "Monatonic Clocks" should be used to measure differences in time, as these are not subject to leap seconds or (according to that linked go answer) changes in frequency of the clock.

Have I got that right? Is there anything else to consider? Assuming those definitions are correct, which category does each of these clocks belong to, or in other words, which of these clocks will give me the most accurate measurement of changes in time:

  • VBA Time
  • VBA Timer
  • WinApi GetTickCount
  • WinApi GetSystemTimePreciseAsFileTime
  • WinApi QueryPerformanceFrequency + QueryPerformanceCounter

Or is it something else?
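To make the comparison concrete, here's roughly how I'd read each of those from VBA. Treat this as an untested sketch: the declarations are the VBA7/64-bit flavour (older hosts need them without PtrSafe), Currency is standing in for the 64-bit FILETIME/LARGE_INTEGER arguments, and the resolutions in the comments are ballpark figures. Note that GetSystemTimePreciseAsFileTime also requires Windows 8 or later.

    Private Declare PtrSafe Function GetTickCount Lib "kernel32" () As Long
    Private Declare PtrSafe Sub GetSystemTimePreciseAsFileTime Lib "kernel32" (ByRef lpSystemTimeAsFileTime As Currency)
    Private Declare PtrSafe Function QueryPerformanceFrequency Lib "kernel32" (ByRef lpFrequency As Currency) As Long
    Private Declare PtrSafe Function QueryPerformanceCounter Lib "kernel32" (ByRef lpPerformanceCount As Currency) As Long

    Public Sub ReadAllClocks()
        Dim ft As Currency, freq As Currency, ticks As Currency
        GetSystemTimePreciseAsFileTime ft       ' wall clock, 100 ns units since 1601 (Windows 8+)
        QueryPerformanceFrequency freq          ' counts per second, fixed at boot
        QueryPerformanceCounter ticks           ' monotonic tick count

        Debug.Print "VBA Time:      "; Time          ' wall clock, 1 s resolution
        Debug.Print "VBA Timer:     "; Timer         ' wall clock, seconds since midnight, ms-range granularity
        Debug.Print "GetTickCount:  "; GetTickCount  ' monotonic, ms since boot, ~10-16 ms resolution
        Debug.Print "Precise FT:    "; ft            ' raw value scaled by 1/10000 by the Currency type
        Debug.Print "QPC seconds:   "; ticks / freq  ' monotonic, sub-microsecond resolution
    End Sub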

I may be overlooking other approaches. I say this because some languages like Java get time in nanoseconds rather than microseconds; how is this possible? Surely the Windows API will tap into the most accurate hardware timer available, which gives microsecond resolution. What's Java doing, and can I copy that?

PS: I have no idea how to tag this; please add tags as you think appropriate.

Greedo
  • For what reason do you need that level of accuracy? – fbueckert May 18 '18 at 16:06
  • @fbueckert I'm trying to make a highly accurate stopwatch using pure VBA, which I haven't stumbled across yet. But I mainly asked this question because I'm interested to learn how these specific timers work/differ and whether they're the best out there (as touted in those linked VBA answers - I don't think they were fully justified) – Greedo May 18 '18 at 16:09
  • I've mashed together a clock in javascript for work before, but that's because of a specific requirement. I don't think VBA is your best bet for an accurate timing app, so if you know Java is more accurate, it would be a better bet to use that. – fbueckert May 18 '18 at 16:13
  • 1
    @fbueckert pretty sure you'd have a much easier time making a COM-visible wrapper over a .NET `StopWatch` than trying to get some Java code to run off a VBA code base IMO. Looks like Greedo is trying to make a reusable component that VBA code can use for measuring time, not a "timing app". – Mathieu Guindon May 18 '18 at 16:16
  • @MathieuGuindon I'm not up on how clocks work. I'm just pointing out that using the right tool for the job is probably going to be much easier in the long run. My experience with VBA is that it's a painful process. And is almost *never* the right tool, if you have other options. – fbueckert May 18 '18 at 16:19
  • @fbueckert I don't think I do know that Java is more accurate. The clock may return a more *precise* answer, but for all I know that precision is a load of meaningless decimal places - the timer is not necessarily accurate to the nanosecond. That's part of what I'm asking - it seems strange that a language should claim to have a timer more accurate than the WinApi, so I wonder whether there's substance behind the claim, and if so, how I can tap into it from VBA (and whether it's the right *sort* of clock to tap into for time differences anyway). I'm writing in VBA as I want to characterise VBA – Greedo May 18 '18 at 16:19
  • https://stackoverflow.com/questions/3744032/why-are-net-timers-limited-to-15-ms-resolution In the end you can't measure more accurately than your hardware allows. – Tim Williams May 18 '18 at 17:53

1 Answer


QueryPerformanceFrequency and QueryPerformanceCounter are going to give you the highest-resolution timer available. On most modern hardware they are backed by the CPU's time-stamp counter (the RDTSC instruction), which counts elapsed CPU cycles.
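For example, something along these lines should work from VBA (untested sketch; Currency is a common stand-in for the 64-bit LARGE_INTEGER arguments, and its implicit 1/10000 scaling cancels out in the final division, so the result comes out in plain seconds):

    #If VBA7 Then
        Private Declare PtrSafe Function QueryPerformanceFrequency Lib "kernel32" (ByRef lpFrequency As Currency) As Long
        Private Declare PtrSafe Function QueryPerformanceCounter Lib "kernel32" (ByRef lpPerformanceCount As Currency) As Long
    #Else
        Private Declare Function QueryPerformanceFrequency Lib "kernel32" (ByRef lpFrequency As Currency) As Long
        Private Declare Function QueryPerformanceCounter Lib "kernel32" (ByRef lpPerformanceCount As Currency) As Long
    #End If

    Public Sub TimeSomething()
        Dim freq As Currency, t0 As Currency, t1 As Currency
        QueryPerformanceFrequency freq          ' counts per second, fixed at boot
        QueryPerformanceCounter t0

        Dim i As Long, x As Double
        For i = 1 To 1000000
            x = x + Sqr(i)                      ' the work being measured
        Next

        QueryPerformanceCounter t1
        Debug.Print "Elapsed: " & Format$((t1 - t0) / freq, "0.000000") & " s"
    End Sub

Bear in mind the counter measures elapsed wall time on a monotonic scale, not CPU time, so anything else running on the machine is included in the measurement.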

Varrak