
I am using a timer with an interval of 1 second, but when I print the time in the timer's tick event, it's always 62 or 65 ms. I don't understand why it's taking 10 ms more.

Could someone please take a look at this?

Here is the code I am using:

static int _counter;
System.Timers.Timer _timer = new System.Timers.Timer(1000);

public Form1()
{
    InitializeComponent();
    _timer.Elapsed += new ElapsedEventHandler(_timer_Elapsed);
    _timer.Start();
}

private void _timer_Elapsed(object sender, ElapsedEventArgs e)
{
    Console.WriteLine(DateTime.Now.ToString("{hh:mm:ss.fff}"));          
    _counter++;
    if (_counter == 20)
        _timer.Stop();
}

And this is the output:

{01:59:08.381}
{01:59:09.393}
{01:59:10.407}
{01:59:11.421}
{01:59:12.435}
{01:59:13.449}
{01:59:14.463}
{01:59:15.477}
{01:59:16.491}
{01:59:17.505}
{01:59:18.519}
{01:59:19.533}
{01:59:20.547}
{01:59:21.561}
{01:59:22.575}
{01:59:23.589}
{01:59:24.603}
{01:59:25.615}
{01:59:26.629}
{01:59:27.643}
Embedd_0913
  • Your problem description and example are inconsistent. The example is set for 1-second intervals and appears to be showing 1-second intervals, not 50 ms. – simon Jun 07 '10 at 12:25

11 Answers

15

You need to understand that Windows is not a real-time operating system. Real-time operating systems have timer mechanisms that allow the system to make hard guarantees about when timer-initiated events occur and the overhead associated with them, and allow you to specify what behavior should occur when the deadline is missed -- for example if the previous execution took longer than the interval.

I would characterize the Windows timers as "best effort" when it comes to smaller intervals. When the interval is sufficiently long, you don't notice that you aren't getting the exact interval you requested. As you get closer to the resolution of the timer (the frequency at which the timer runs), the overhead becomes a larger percentage of the interval. Real-time systems take special care to minimize software overhead, relying on more sophisticated and faster hardware solutions. The exact frequency of the Windows timer depends on the timing services that the underlying hardware provides, and so may differ from system to system.

If you have real-time needs -- and doing something every 50ms may fall into that category -- then you may need to look at specialized hardware and/or a real-time OS.

tvanfosson
    Sir, you are not aware of the specific Timer that the user asked for. This Timer is not precise by default and has nothing to do with the OS. There are other more precise Timers in Windows (e.g. multimedia timers). – Pavlos Fragkiadoulakis Jul 16 '19 at 22:54
  • @PavlosFragkiadoulakis lol - don't mistake the granularity of a timer for its predictability. If you need strict real-time guarantees, Windows is not the OS you should choose. While it might be adequate for playing music, I seriously doubt that you'd run an aircraft on it. – tvanfosson Jul 17 '19 at 13:41
7

It's because of the limited resolution of the system clock. The event occurs at the next system tick after the specified time, so you will always get a few extra milliseconds.
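
To see this quantization directly, here is a minimal sketch (not from the original answer) that spins on DateTime.UtcNow and prints the size of each jump in the clock's value; on a default Windows setup it typically reports steps of roughly 10-16 ms:

using System;

class ClockResolution
{
    static void Main()
    {
        long previous = DateTime.UtcNow.Ticks;
        for (int i = 0; i < 10; i++)
        {
            long current;
            // Busy-wait until the clock value visibly advances.
            do { current = DateTime.UtcNow.Ticks; } while (current == previous);

            // One tick is 100 ns, so divide by 10,000 to get milliseconds.
            Console.WriteLine("Clock stepped by {0:F1} ms", (current - previous) / 10000.0);
            previous = current;
        }
    }
}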

Guffa
5

If you need a more precise timer, you can hook into the Win32 Multimedia Timer, it is the most accurate timer (down to 1ms). Here's an article on CodeProject showing how to hook into it from C#.
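
For reference, a minimal sketch of the underlying winmm.dll calls the article wraps (this is an illustration of the Win32 API, not the article's code):

using System;
using System.Runtime.InteropServices;
using System.Threading;

class MultimediaTimerDemo
{
    // Callback signature required by timeSetEvent.
    delegate void TimeProc(uint id, uint msg, UIntPtr user, UIntPtr dw1, UIntPtr dw2);

    [DllImport("winmm.dll")]
    static extern uint timeSetEvent(uint delayMs, uint resolutionMs, TimeProc callback, UIntPtr user, uint eventType);

    [DllImport("winmm.dll")]
    static extern uint timeKillEvent(uint timerId);

    const uint TIME_PERIODIC = 1; // fire repeatedly rather than once

    static void Main()
    {
        TimeProc callback = (id, msg, user, dw1, dw2) =>
            Console.WriteLine(DateTime.Now.ToString("hh:mm:ss.fff"));

        // 50 ms period, 1 ms requested resolution.
        uint timerId = timeSetEvent(50, 1, callback, UIntPtr.Zero, TIME_PERIODIC);

        Thread.Sleep(2000); // let it tick for a couple of seconds
        timeKillEvent(timerId);

        // Keep the delegate alive so the GC can't collect it while
        // native code still holds a pointer to it.
        GC.KeepAlive(callback);
    }
}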

JohnForDummies
4

First, as other people have noted, you're setting it to 1s, not 50ms.

Secondly, Windows is not a real-time OS. None of the timer classes are exactly precise. All you're doing is saying that you want to wait at least this long. It takes some amount of time for everything to fire and for you to be notified that the timer has ticked, once Windows gets around to actually servicing the tick message.
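
A small sketch (mine, not part of the original answer) makes the "at least this long" behavior visible by timing Thread.Sleep(1):

using System;
using System.Diagnostics;
using System.Threading;

class SleepIsAMinimum
{
    static void Main()
    {
        for (int i = 0; i < 5; i++)
        {
            var sw = Stopwatch.StartNew();
            Thread.Sleep(1); // ask for 1 ms...
            sw.Stop();
            // ...but expect to be woken later, often around 15 ms
            // on a default Windows timer period.
            Console.WriteLine("Requested 1 ms, slept {0:F1} ms", sw.Elapsed.TotalMilliseconds);
        }
    }
}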

Donnie
  • You can make Windows behave better by using the performance counter in a thread that is bound to a single CPU core to calibrate the high-resolution timer. I think. I've seen it done but didn't write the code concerned; some remarkably low-level techniques are required (i.e., possibly direct machine code!) and there are evil firmware bugs in this area. Almost better to just stop worrying about clock accuracy IMO… – Donal Fellows Jun 07 '10 at 12:50
  • Timing accuracy matters an extreme amount when dealing with A/D peripherals, sensory equipment, and medical devices. It is very, very important. But, as @Donnie pointed out, Windows is *NOT* a real-time OS. Try WinCE or Windows Embedded if you *must* use a Windows OS. – Stephen Furlani Jun 07 '10 at 13:13
2

Note that usually, in most languages, sleep calls specify the minimum time after which a process will be awakened. After the specified time has passed, the process is put on a queue, and hopefully the scheduler activates it; this activation may sometimes be delayed. I'm not sure about the Timer class, but I suspect it may suffer from a similar problem.
You could try increasing the priority of your process to cut down the extra time.
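
As a rough illustration (a sketch of my own; whether it actually helps depends on the system load):

using System;
using System.Diagnostics;
using System.Threading;

class RaisePriority
{
    static void Main()
    {
        // Hint to the scheduler that this process should run promptly.
        // This can reduce, but does not eliminate, wake-up delays.
        using (var current = Process.GetCurrentProcess())
        {
            current.PriorityClass = ProcessPriorityClass.High;
        }
        Thread.CurrentThread.Priority = ThreadPriority.AboveNormal;

        // ... start the timer as in the question ...
    }
}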

apoorv020
2

System.Timers.Timer is not a precise timer. Especially when the system is under load, it can have even bigger delays.

Also, to measure the intervals more accurately in your example, change the time-measuring code to use the Stopwatch class:

static int _counter;
System.Timers.Timer _timer = new System.Timers.Timer(1000);
Stopwatch sw; // requires using System.Diagnostics;

public Form1()
{
    InitializeComponent();
    _timer.Elapsed += new ElapsedEventHandler(_timer_Elapsed);
    _timer.Start();
    sw = Stopwatch.StartNew();
}

void _timer_Elapsed(object sender, ElapsedEventArgs e)
{
    // Print the measured interval since the previous tick.
    Console.WriteLine(sw.ElapsedMilliseconds);
    _counter++;
    if (_counter == 20)
        _timer.Stop();

    // Restart the measurement for the next interval.
    sw.Reset();
    sw.Start();
}
Vadym Stetsiak
1

The system timers will always take a little longer than the value requested. This is due to the overhead of other processes in the system.

simon
1

On my system it's 14 ms. Having Googled, the difference is down to thread context-switching delay. There's an article regarding high-resolution timers here
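
If you just want to check what high-resolution timing hardware your own machine exposes, here is a quick sketch (not from the linked article) using the Stopwatch class:

using System;
using System.Diagnostics;

class TimerResolutionCheck
{
    static void Main()
    {
        // Stopwatch uses the high-resolution performance counter when available.
        Console.WriteLine("High resolution: {0}", Stopwatch.IsHighResolution);
        Console.WriteLine("Ticks per second: {0:N0}", Stopwatch.Frequency);
        Console.WriteLine("Best resolution: {0:F1} ns", 1e9 / Stopwatch.Frequency);
    }
}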

John Warlow
1

As other responders have mentioned, Windows is not a real-time OS. If you must use Windows, try using Windows CE or Windows Embedded.

-S!

Stephen Furlani
1

The accuracy of the timer may depend on how many processes are running. If you have that option, I would shut down the processes on your computer one by one, meaning those which consume significant CPU time, and check whether the times improve. In particular: browsers, virus scanners, and programs running in the background.

Aftershock
0

The deviations are normal, since Windows is not an RTOS (real-time operating system). This is the best solution that I've found under the circumstances: Link

Program.MicroTimer microTimer = new Program.MicroTimer();
microTimer.MicroTimerElapsed += new Program.MicroTimer.MicroTimerElapsedEventHandler(OnTimedEvent);
microTimer.Interval = 1000; // Call micro timer every 1000µs (1ms)

// Can choose to ignore event if late by Xµs (by default will try to catch up)
// microTimer.IgnoreEventIfLateBy = 500; // 500µs (0.5ms)

microTimer.Enabled = true; // Start timer
System.Threading.Thread.Sleep(2000);
microTimer.Enabled = false;

Those are the code snippets. You can try them to see the values in the console.
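
For completeness, a hypothetical OnTimedEvent handler to pair with the snippet above; the MicroTimerEventArgs member names used here (ElapsedMicroseconds, TimerLateBy) are assumptions based on the linked article and may differ in your copy of the code:

// Hypothetical handler; the event-args members are assumptions, see above.
private static void OnTimedEvent(object sender, Program.MicroTimerEventArgs timerEventArgs)
{
    Console.WriteLine("Elapsed: {0} µs, late by: {1} µs",
        timerEventArgs.ElapsedMicroseconds, timerEventArgs.TimerLateBy);
}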

BurhanBoz