
I wrote a simple program that uses a timer to fade out a label. I found that the timer is quite imprecise, and the interval gets shorter and shorter.

The following code demonstrates the problem:
(I also implemented it with Stopwatch. The interval between ticks is almost equal.)


private void timer1_Tick(object sender, EventArgs e)
{
    elapsed += timer1.Interval;
    timerTest.AppendText(DateTime.UtcNow.Second.ToString() + "." + DateTime.UtcNow.Millisecond.ToString() + "\r\n");
    if (elapsed >= target)
        timer1.Stop();
}
int elapsed = 0;
int target = 4000;

private void button_Click(object sender, EventArgs e)
{
    labelFadeout.ForeColor = Color.Green;
    elapsed = 0;
    timer1.Interval = 100;
    timer1.Tick += timer1_Tick;
    timer1.Start();
}

And the 1st and 5th runs produce very different output!


[1st]   [5th]
20.318  42.955
20.377  42.956
20.491  42.957
20.595  42.958
20.707  42.959
20.814  43.68
20.929  43.69
21.34   43.7
21.142  43.71
21.257  43.72
21.365  43.173
21.471  43.176
21.584  43.177
21.692  43.179
21.8    43.18
21.909  43.286
22.19   43.288
22.127  43.289
22.242  43.291
22.347  43.293
22.454  43.397
22.569  43.4
22.673  43.402
22.784  43.404
22.892  43.406
23.4    43.649
23.112  43.652
23.221  43.655
23.331  43.657
23.494  43.66
23.549  43.746
23.662  43.749
23.779  43.751
23.879  43.754
23.991  43.756
24.95   43.846
24.209  43.848
24.316  43.851
24.427  43.853
24.534  43.855
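
Note: the out-of-order-looking values above (e.g. 43.68 appearing between 42.959 and 43.69, or 21.34 in the first column) are likely a display artifact rather than clock jumps: `DateTime.Millisecond.ToString()` does not zero-pad, so 43 s + 68 ms prints as "43.68" instead of "43.068". A minimal console sketch of the effect (the timestamp value is made up for illustration):

```csharp
using System;

class MillisecondFormatDemo
{
    static void Main()
    {
        var t = new DateTime(2013, 10, 11, 13, 20, 43, 68); // 43 s, 68 ms

        // Unpadded: 68 ms prints as "68", so the line reads "43.68"
        Console.WriteLine(t.Second + "." + t.Millisecond);                // 43.68

        // Zero-padded with the "D3" format: the same instant reads "43.068"
        Console.WriteLine(t.Second + "." + t.Millisecond.ToString("D3")); // 43.068
    }
}
```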

I've checked the MSDN documentation for Timer and am still curious how it could be so imprecise. Can anyone explain this?

Kordan Ou
  • I believe that Timer is pretty inaccurate when measuring time between two consecutive ticks, but its accuracy will not degrade over time. In other words, the 100th tick will have a smaller percentage error, but can have a bigger absolute error, than the 2nd tick. Rephrasing it even more: starting/stopping the timer will create a huge additive error over time, while starting it once and waiting for the 100th tick is pretty accurate. – Sinatr Oct 11 '13 at 13:13
  • Have you read the page you linked, http://msdn.microsoft.com/en-us/library/xy0zeach%28v=vs.80%29.aspx? To quote: "The interval is not guaranteed to elapse exactly on time. To ensure accuracy, the timer should check the system clock as needed." – Jodrell Oct 11 '13 at 13:17
  • Anyway, just because the timer raises an event, you can't guarantee when the event will be processed and, consequently, when your various calls to `UtcNow` will be made. – Jodrell Oct 11 '13 at 13:23
  • See [this](http://stackoverflow.com/q/11531128/1504523) SO question. – Arno Oct 11 '13 at 13:34
  • @Sinatr I have the same opinion as you, and that is also why I asked this question and printed the system clock time. :p – Kordan Ou Oct 11 '13 at 17:12
  • @Jodrell Yes, I know there is latency between DateTime.UtcNow calls. A ±5% error is tolerable, but 4216 ms versus 900 ms is a HUGE difference. – Kordan Ou Oct 11 '13 at 17:15
  • @Arno Great post and answer! If ALL five of my results were inaccurate, that would be acceptable. Unfortunately, they're not. Might the tick callback be optimized by the .NET runtime? (Nonsense, though, in this case.) – Kordan Ou Oct 11 '13 at 17:24

1 Answer


Your method of testing the interval is off. You're really just taking part of the current time at each tick and expecting it to be comparable to the previous reading.

If you want to test how much time a timer really takes to tick at a 100 ms interval, use an actual Stopwatch and check its Elapsed property to find out how much time has passed.

Stopwatch stopwatch = new Stopwatch();
int elapsed = 0;
int target = 4000;

private void timer1_Tick(object sender, EventArgs e)
{
    elapsed += timer1.Interval;
    // Log the real time since the previous tick, then restart the measurement
    timerTest.AppendText(stopwatch.Elapsed.TotalSeconds.ToString() + "\r\n");
    stopwatch.Restart();

    if (elapsed >= target)
        timer1.Stop();
}

private void button1_Click(object sender, EventArgs e)
{
    elapsed = 0;
    timer1.Interval = 100;
    timer1.Tick += timer1_Tick;
    stopwatch.Start();
    timer1.Start();
}
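
Following the MSDN advice quoted in the comments ("the timer should check the system clock as needed"), the stopping condition itself can also be driven by real elapsed time instead of accumulating `timer1.Interval`. A minimal sketch under the same setup as above:

```csharp
Stopwatch stopwatch = new Stopwatch();
int target = 4000; // ms

private void timer1_Tick(object sender, EventArgs e)
{
    // ElapsedMilliseconds reflects wall-clock time, so late (or extra)
    // ticks no longer make the total run longer or shorter than intended.
    if (stopwatch.ElapsedMilliseconds >= target)
    {
        timer1.Stop();
        stopwatch.Reset();
    }
}
```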

My result:

0.1055445
0.0872668
0.1121169
0.1032453
0.1066107
0.1097218
0.103818
0.1079014
...
Khan
  • Thanks, but what I really care about is how long the 40 ticks (4000/100) take in total. As I said, "(I also implemented it with Stopwatch. The interval between ticks is almost equal.)" The 1st run takes 4216 ms but the 5th takes only 900 ms; the error changes from +5.4% to -77.5%. – Kordan Ou Oct 11 '13 at 17:07
  • The 10th run with your code: 2.9E-06 1.9E-06 2.4E-06 2.4E-06 2.4E-06 0.0002083 2.9E-06 2.9E-06 2.4E-06 3.4E-06 2.9E-06 2.4E-06 2.4E-06 2.9E-06 1.9E-06 2.4E-06 2.4E-06 4.8E-06 2.9E-06 2.9E-06 1.9E-06 2.9E-06 3.4E-06 – Kordan Ou Oct 14 '13 at 02:49