
I've started developing a desktop app in C# (VS2010, .NET Framework 4.0) that involves several timers.

At first, I was using System.Timers to send data over USB to a data bus. The point is that I need to send different periodic binary messages at several specific intervals, such as 10ms, 20ms, 40ms, 100ms, 500ms, 1000ms ....

Each timer has a callback function that has to send the correct data flow for each raised event.
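To make the setup concrete, here is a minimal sketch of the current approach, one `System.Timers.Timer` per period (the `SendMessage` placeholder stands in for my actual USB write):

```csharp
using System;
using System.Timers;

class BusSender
{
    static void Main()
    {
        int[] periodsMs = { 10, 20, 40, 100, 500, 1000 };
        foreach (int period in periodsMs)
        {
            var timer = new Timer(period);       // interval in milliseconds
            int p = period;                      // capture a local copy for the lambda
            timer.Elapsed += (s, e) => SendMessage(p);
            timer.AutoReset = true;
            timer.Start();
        }
        Console.ReadLine();                      // keep the process alive
    }

    // Hypothetical placeholder for the real USB write.
    static void SendMessage(int periodMs)
    {
        // device.Write(BuildFrame(periodMs));
    }
}
```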

The problem is that when I have the full set of signals working, the real intervals differ from the expected ones (a 3000ms timer fires at ~3500ms, a 100ms timer at ~340ms): once I add the 10ms and 40ms timers, the CPU load is almost 100% and all precision is lost.

I'm working in VMware with 2GB RAM and 2 CPU cores. The host machine has an i7-2600K CPU @ 3.40GHz. I don't think this is the problem (but I'm not sure).

Before writing here, I searched for an answer on how to get more precise timing with the lowest resource consumption, but everything I found was unspecific.

I already know about System.Diagnostics.Stopwatch, having read about it here, but that class has no events. There is also System.Windows.Forms.Timer, but it is even more imprecise and has lower resolution.
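Even without events, `Stopwatch` is useful for quantifying the drift. This sketch measures the real interval between `Elapsed` events of a 10ms timer (the printed values are what show the problem):

```csharp
using System;
using System.Diagnostics;

class DriftTest
{
    static void Main()
    {
        var sw = Stopwatch.StartNew();
        long lastTicks = 0;
        var timer = new System.Timers.Timer(10);
        timer.Elapsed += (s, e) =>
        {
            long now = sw.ElapsedTicks;
            // Convert high-resolution ticks to milliseconds.
            double intervalMs = (now - lastTicks) * 1000.0 / Stopwatch.Frequency;
            lastTicks = now;
            Console.WriteLine("interval: {0:F2} ms", intervalMs);
        };
        timer.Start();
        Console.ReadLine();
    }
}
```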

There is a good implementation here with microsecond resolution, but it is the reason my CPU is overloaded!

What do you think my next steps should be? I'll appreciate any help or ideas you have.

The goal is 10% timing tolerance at the 10ms interval (i.e., ±1ms)!

I will clarify any extra info you need!
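For reference, this is the kind of multimedia-timer approach I have been testing. It is a sketch assuming the legacy `winmm.dll` API (`timeBeginPeriod`/`timeSetEvent`), which on desktop Windows typically fires with about 1ms accuracy; the callback body is a hypothetical placeholder:

```csharp
using System;
using System.Runtime.InteropServices;

class MultimediaTimerDemo
{
    delegate void TimerCallback(uint id, uint msg, IntPtr user, IntPtr dw1, IntPtr dw2);

    [DllImport("winmm.dll")]
    static extern uint timeBeginPeriod(uint msec);
    [DllImport("winmm.dll")]
    static extern uint timeEndPeriod(uint msec);
    [DllImport("winmm.dll")]
    static extern uint timeSetEvent(uint delay, uint resolution,
        TimerCallback callback, IntPtr user, uint eventType);
    [DllImport("winmm.dll")]
    static extern uint timeKillEvent(uint id);

    const uint TIME_PERIODIC = 1;

    static void Main()
    {
        timeBeginPeriod(1);                      // request 1ms system timer resolution
        TimerCallback cb = (id, msg, user, dw1, dw2) =>
        {
            // send the 10ms frame here; keep the callback short
        };
        uint timerId = timeSetEvent(10, 1, cb, IntPtr.Zero, TIME_PERIODIC);
        Console.ReadLine();
        timeKillEvent(timerId);
        timeEndPeriod(1);
        GC.KeepAlive(cb);                        // prevent the delegate from being collected
    }
}
```

Note that the delegate must be kept alive for as long as the timer runs, otherwise the GC can collect it and the native callback will crash the process.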

  • 7
    The desktop version of Windows is not a real-time operating system. If you have strict real-time requirements then use a real-time operating system; otherwise you will be fighting the assumptions of the operating system. You should consider purchasing specialized hardware if you have strict real-time requirements. – Eric Lippert Jan 09 '14 at 17:45
  • 2
    http://stackoverflow.com/q/2989350/62576 – Ken White Jan 09 '14 at 17:47
  • 1
    You can't. The resolution will never be that accurate in an application in Windows / not at a hardware / driver level. – Bob G Jan 09 '14 at 17:48
  • 1
    To add to the confusion [Precision is not the same as accuracy](http://blogs.msdn.com/b/oldnewthing/archive/2005/09/02/459952.aspx) – rene Jan 09 '14 at 18:03
  • I found [here](http://www.codeproject.com/Articles/17474/Timer-surprises-and-how-to-avoid-them) a good application in C# to test your OS's timing capacity. In my virtual machine the **Native Multimedia Timers** really do achieve an average interval of 1msec. – Andre_Saffin Jan 14 '14 at 14:45

0 Answers