I'd like to add the following regarding MusiGenesis' answer, concerning the re-sync timing: what interval should be used to re-sync (the `_maxIdle` in MusiGenesis' answer)?
You know that with this solution you are not perfectly accurate; that's why you re-sync. But what you also implicitly want is the same thing as Ian Mercer's solution: a unique timestamp that is allocated in strictly ascending order.
Therefore the amount of time between two re-syncs (`_maxIdle`; let's call it `SyncTime`) should be a function of 4 things (see the sketch after this list):

- the `DateTime.UtcNow` resolution
- the accuracy ratio you want
- the precision level you want
- an estimate of the out-of-sync ratio of your machine
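To make the role of `SyncTime` concrete, here is a minimal sketch of the kind of hybrid clock MusiGenesis describes: `DateTime.UtcNow` anchors a `Stopwatch`, and the anchor is refreshed once the elapsed time exceeds `SyncTime`. The class and member names are mine (hypothetical), not taken from his answer, and thread safety is left out to keep it short.

```csharp
using System;
using System.Diagnostics;

// Hypothetical sketch: a Stopwatch-based clock that re-anchors itself
// to DateTime.UtcNow every SyncTime, so the accumulated drift stays bounded.
public class ResyncingClock
{
    private readonly TimeSpan _syncTime;   // the re-sync interval discussed below
    private DateTime _baseTime;            // UtcNow captured at the last sync
    private Stopwatch _stopwatch;          // time elapsed since the last sync

    public ResyncingClock(TimeSpan syncTime)
    {
        _syncTime = syncTime;
        Resync();
    }

    private void Resync()
    {
        _baseTime = DateTime.UtcNow;
        _stopwatch = Stopwatch.StartNew();
    }

    public DateTime UtcNow
    {
        get
        {
            if (_stopwatch.Elapsed > _syncTime)
                Resync();                  // drop whatever drift has accumulated
            return _baseTime + _stopwatch.Elapsed;
        }
    }
}
```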
Obviously the first constraint on this variable would be:
out-of-sync ratio <= accuracy ratio
For example: I don't want my accuracy to be worse than 0.5s/hour, or 1ms/day, etc. (in other words: I don't want to be off by more than 0.5s/hour = 12s/day).
So you cannot achieve a better accuracy than what the `Stopwatch` offers you on your PC. It depends on your out-of-sync ratio, which might not be constant.
Another constraint is the minimum time between two re-syncs:
SyncTime >= `DateTime.UtcNow` resolution
Here accuracy and precision are linked, because if you use a high precision (for example to store in a DB) together with a lower accuracy, you might break Ian Mercer's requirement, which is the strictly ascending order.
Note: It seems `DateTime.UtcNow` may have a finer default resolution than 15ms (1ms on my machine). Follow the link:
High accuracy DateTime.UtcNow
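If you want to check this on your own machine, one rough (purely illustrative) way to estimate the `DateTime.UtcNow` step is to spin until the returned value changes and measure the jump:

```csharp
using System;

// Rough estimate of the DateTime.UtcNow resolution on this machine:
// spin until the returned value changes and keep the smallest step seen.
class UtcNowResolution
{
    static void Main()
    {
        long minStepTicks = long.MaxValue;
        for (int i = 0; i < 10; i++)
        {
            long start = DateTime.UtcNow.Ticks;
            long next;
            do { next = DateTime.UtcNow.Ticks; } while (next == start);
            minStepTicks = Math.Min(minStepTicks, next - start);
        }
        // One tick is 100 ns, so 10,000 ticks = 1 ms.
        Console.WriteLine($"Observed UtcNow step: ~{minStepTicks / 10000.0} ms");
    }
}
```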
Let's take an example.
Imagine the out-of-sync ratio from the comments above: after about 10 hours, the `Stopwatch`-based timer was ahead by 5 seconds.
Say I want microsecond precision. My timer resolution is 1ms (see the Note above).
So, point by point:

- the `DateTime.UtcNow` resolution: 1ms
- accuracy ratio >= out-of-sync ratio; let's take the most accurate possible, so accuracy ratio = out-of-sync ratio
- the precision level you want: 1 microsecond
- an estimate of the out-of-sync ratio of your machine: 0.5s/hour (this is also my accuracy ratio)
If you re-sync every 10s, imagine you are at 9.999s, 1ms before the reset.
If you make a call during this interval, the time your function returns is ahead by 0.5/3600 * 9.999s ≈ 1.39ms.
You would display a time of 10.000390s. After the `UtcNow` tick, if you make a call within those 390 microseconds, you will get a number lower than the previous one. It's worse if this out-of-sync ratio varies randomly depending on CPU load or other things.
Now let's say I set `SyncTime` to its minimum value: I re-sync every 1ms.
The same reasoning puts me ahead of time by 0.139 microsecond, which is below the precision I want. Therefore if I call the function at 9.999ms, i.e. 1 microsecond before the reset, I will get 9.999, and just after it I will get 10.000. The order is preserved.
So here the other constraint is: accuracy-ratio * SyncTime < precision level. To be safe, because numbers can be rounded up, accuracy-ratio * SyncTime < precision level / 2 is a good choice.
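Just to check the arithmetic of the two cases above (the numbers are only those of this hypothetical example):

```csharp
using System;

// Worst-case drift accumulated just before a re-sync, for the two
// SyncTime values discussed above, compared against the wanted precision.
class DriftCheck
{
    static void Main()
    {
        double outOfSyncRatio = 0.5 / 3600.0;   // 0.5 s per hour (dimensionless ratio)
        double precision      = 1e-6;           // wanted precision: 1 microsecond, in seconds

        foreach (double syncTime in new[] { 10.0, 0.001 })   // 10 s and 1 ms
        {
            double worstDrift = outOfSyncRatio * syncTime;   // drift right before the re-sync
            bool ok = worstDrift < precision / 2;
            Console.WriteLine($"SyncTime = {syncTime}s -> worst drift = {worstDrift * 1e6:F3} microsec, ok = {ok}");
        }
    }
}
```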
The issue is resolved.
So a quick recap would be:

- Retrieve your timer resolution.
- Compute an estimate of the out-of-sync ratio.
- accuracy ratio >= out-of-sync ratio estimate; the best accuracy = out-of-sync ratio
- Choose your precision level considering the following:
  - timer-resolution <= SyncTime <= PrecisionLevel / (2 * accuracy-ratio)
- The best precision you can achieve is timer-resolution * 2 * out-of-sync-ratio
For the above ratio (0.5s/hour) the correct SyncTime would be 3.6ms, rounded down to 3ms.
With the above ratio and a timer resolution of 1ms, if you want a one-tick precision level (0.1 microsecond), you need an out-of-sync ratio of no more than 180ms/hour.
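As a sanity check, here is a small helper (entirely my own naming, just applying the recap formulas) that reproduces the 3.6ms and 180ms/hour figures:

```csharp
using System;

// Apply the recap formulas:
//   SyncTime <= PrecisionLevel / (2 * accuracyRatio)
//   best precision = timerResolution * 2 * outOfSyncRatio
class SyncTimeCalculator
{
    static void Main()
    {
        double timerResolution = 1e-3;          // 1 ms, in seconds
        double outOfSyncRatio  = 0.5 / 3600.0;  // 0.5 s per hour (also used as the accuracy ratio)

        // Maximum SyncTime for a 1 microsecond precision level.
        double maxSyncTime = 1e-6 / (2 * outOfSyncRatio);
        Console.WriteLine($"Max SyncTime for 1 microsec precision: {maxSyncTime * 1e3:F1} ms");            // ~3.6 ms

        // Maximum drift rate for one-tick (0.1 microsec) precision when SyncTime = timer resolution.
        double maxRatioForTick = 0.1e-6 / (2 * timerResolution);
        Console.WriteLine($"Max drift for one-tick precision: {maxRatioForTick * 3600 * 1e3:F0} ms/hour"); // ~180 ms/hour

        // Best achievable precision with this timer resolution and this drift rate.
        double bestPrecision = timerResolution * 2 * outOfSyncRatio;
        Console.WriteLine($"Best precision: {bestPrecision * 1e6:F3} microsec");                           // ~0.278 microsec
    }
}
```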
In the last comment on his own answer, MusiGenesis states:

> @Hermann: I've been running a similar test for the last two hours (without the reset correction), and the Stopwatch-based timer is only running about 400 ms ahead after 2 hours, so the skew itself appears to be variable (but still pretty severe). I'm pretty surprised that the skew is this bad; I guess this is why Stopwatch is in System.Diagnostics. – MusiGenesis

So the `Stopwatch` accuracy is close to 200ms/hour, almost our 180ms/hour. Is there a reason why our number and this number are so close? I don't know. But this accuracy is enough for us to achieve tick precision.
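For reference, here is a rough sketch of the kind of drift test MusiGenesis describes (anchor a free-running `Stopwatch` once, then periodically compare it against `DateTime.UtcNow`); the structure and the logging interval are only my assumption of how such a test could look:

```csharp
using System;
using System.Diagnostics;
using System.Threading;

// Rough drift test: anchor a Stopwatch once, then periodically compare
// (anchor + elapsed) against DateTime.UtcNow and log the divergence.
class StopwatchDriftTest
{
    static void Main()
    {
        DateTime anchor = DateTime.UtcNow;
        Stopwatch sw = Stopwatch.StartNew();

        while (true)
        {
            Thread.Sleep(TimeSpan.FromMinutes(5));
            TimeSpan skew = (anchor + sw.Elapsed) - DateTime.UtcNow;
            Console.WriteLine($"{sw.Elapsed}: Stopwatch time is ahead by {skew.TotalMilliseconds:F1} ms");
        }
    }
}
```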
The best precision level: for the example above it is about 0.28 microseconds.
However, what happens if I call the function multiple times between 9.999ms and the re-sync?
Two calls could end up returning the same timestamp: the time would be 9.999 for both (as I don't see any more precision). To circumvent this, you cannot touch the precision level, because it is linked to SyncTime by the relation above. So you should implement Ian Mercer's solution for those cases, as sketched below.
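A sketch along the lines of Ian Mercer's idea, i.e. a lock-free loop that never hands out the same or a smaller tick value twice (the exact code here is illustrative, not his verbatim answer):

```csharp
using System;
using System.Threading;

// Strictly ascending timestamp source: if the clock has not moved since
// the last call, hand out the previous value plus one tick instead.
public static class AscendingTimestamp
{
    private static long _lastTicks = DateTime.UtcNow.Ticks;

    public static long UtcNowTicks
    {
        get
        {
            long original, next;
            do
            {
                original = _lastTicks;
                long now = DateTime.UtcNow.Ticks;   // this could read the re-synced clock instead
                next = Math.Max(now, original + 1); // strictly greater than the last returned value
            }
            while (Interlocked.CompareExchange(ref _lastTicks, next, original) != original);
            return next;
        }
    }
}
```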
Please don't hesitate to comment on my answer.