
My application receives instructions from my client.

For each instruction I want to execute code 10 seconds after the received timestamp.

To test the accuracy of my code I mocked up this simple console test:

    static DateTime d1 = DateTime.Now;
    static Timer _timer = null;

    static void Main(string[] args)
    {
        _timer = new Timer(Callback, null, 200, Timeout.Infinite);
        d1 = DateTime.Now;
        Console.ReadLine();
    }

    private static void Callback(Object state)
    {
        TimeSpan s = DateTime.Now - d1;
        Console.WriteLine("Done, took {0} secs.", s.Milliseconds);
        _timer.Change(200, Timeout.Infinite);
    }

On running it I had hoped it would display:

Done, took 200 milsecs.
Done, took 200 milsecs.
Done, took 200 milsecs.
Done, took 200 milsecs.
Done, took 200 milsecs.
Done, took 200 milsecs.

But it does not.

I get random values for 'milsecs'.

NB My actual code will not always be using 200 milliseconds.

Example screenshot: (console output showing the varying values; image not reproduced here)

Andrew Simpson
  • How random are the values you are getting? Do you mean they are off by 5 milliseconds, or are you getting some that are off by 500 milliseconds? Please clarify. – Jacob Lambert May 25 '15 at 16:31
  • There are many timers; which Timer class are you using? Even if you use System.Threading.Timer, thread switching means there's no guarantee it will execute exactly every 200ms – that's multimedia-playback accuracy. – Panagiotis Kanavos May 25 '15 at 16:31
  • You should be using the [StopWatch](http://stackoverflow.com/questions/457605/how-to-measure-code-performance-in-net) class, not DateTime. – Erik Philips May 25 '15 at 16:31
  • @ErikPhilips Hi, thanks. I will substitute that now and report back – Andrew Simpson May 25 '15 at 16:34
  • @PanagiotisKanavos Thanks for the comment, I am using the Threading Timer. – Andrew Simpson May 25 '15 at 16:35
  • @PanagiotisKanavos You are right. Do you know of a technique that would give me multimedia accuracy? – Andrew Simpson May 25 '15 at 16:52

2 Answers


First, you're using the TimeSpan.Milliseconds property, which is an int and returns only the milliseconds component of the TimeSpan. To get the full elapsed time, use TotalMilliseconds.

Second, you're not resetting d1 with each loop, so you'll see the total time, not the time per iteration. To fix this, make sure to set d1 = DateTime.Now at the end of your callback so the next loop is using the right time.

Finally, you may see a certain discrepancy rather than perfect 200ms intervals, due to the overhead of threading, method calls, and even the subtraction operation you're doing. In addition, DateTime does not always have the greatest accuracy - according to this article from Eric Lippert:

> If the question you want to ask is about how long some operation took, and you want a high-precision, high-accuracy answer, then use the StopWatch class.

Thus:

    static Stopwatch _watch = Stopwatch.StartNew();
    static Timer _timer = null;

    static void Main(string[] args)
    {
        _timer = new Timer(Callback, null, 200, Timeout.Infinite);
        Console.ReadLine();
    }

    private static void Callback(Object state)
    {
        TimeSpan s = _watch.Elapsed;
        _watch.Restart();                       // restart so each callback measures only its own interval
        Console.WriteLine("Done, took {0} ms.", s.TotalMilliseconds);
        _timer.Change(200, Timeout.Infinite);   // re-arm the one-shot timer
    }
David

s.Milliseconds is only the milliseconds component of the TimeSpan (the part left over after the whole seconds), so to speak. You need TotalMilliseconds. Those properties are horribly named; they are the cause of many SO questions.
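
A quick illustration of the difference (values chosen purely for demonstration):

    TimeSpan t = TimeSpan.FromMilliseconds(1250);
    Console.WriteLine(t.Milliseconds);      // 250  - only the milliseconds component
    Console.WriteLine(t.TotalMilliseconds); // 1250 - the whole span, expressed in milliseconds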

The built-in timers have a resolution of about 15 ms (or a small multiple of it), and possibly some skew on top of that, so don't expect intervals of exactly 200 ms.
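
If you really need the finer, multimedia-style accuracy mentioned in the comments, one technique is to raise the system timer resolution with the winmm timeBeginPeriod API while your timers run. A minimal sketch, assuming a Windows desktop process (the 1 ms period and the 200 ms interval are illustrative):

    using System;
    using System.Runtime.InteropServices;
    using System.Threading;

    static class HighResTimerSketch
    {
        // winmm.dll multimedia timer APIs: request/release a minimum timer resolution.
        [DllImport("winmm.dll")]
        private static extern uint timeBeginPeriod(uint uPeriod);

        [DllImport("winmm.dll")]
        private static extern uint timeEndPeriod(uint uPeriod);

        static void Main()
        {
            timeBeginPeriod(1);     // ask the OS for ~1 ms timer resolution
            try
            {
                using (var timer = new Timer(
                    _ => Console.WriteLine(DateTime.Now.ToString("HH:mm:ss.fff")),
                    null, 200, 200))
                {
                    Console.ReadLine(); // let the callbacks run
                }
            }
            finally
            {
                timeEndPeriod(1);   // always restore the previous resolution
            }
        }
    }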

usr