
I am implementing a communication algorithm that sends information periodically and very fast, i.e. 1 ms between packs. I have a working version that uses Tasks to send the packs. Here is an example of my code:

private void Work()
{
    Stopwatch stopwatch = new Stopwatch();
    stopwatch.Start();

    while (!cancellationTokenSource.Token.IsCancellationRequested)
    {
        if (!Pack.PeriodicOn)
            cancellationTokenSource.Cancel();

        // Time margin for sending the packs before the interval elapses
        double tolerance = Pack.Interval * 0.2F;

        // For intervals larger than 25 ms, send the packs at most 5 ms early
        if (tolerance > 5) tolerance = 5;

        TimeSpan timeSpan = stopwatch.Elapsed;

        // The waiting time is controlled by the condition below; if the condition is false, the while loop keeps spinning
        // Send the information a little before the interval elapses to compensate for the transmission delay
        if (Pack.LastSent.TotalMilliseconds == 0 ||
             timeSpan.TotalMilliseconds - Pack.LastSent.TotalMilliseconds >=
             (Pack.Interval - tolerance))
        {
            SendData(Pack);
            Pack.LastSent = timeSpan;
        }
    }

    Pack.LastSent = new TimeSpan(0);
}

My problem is that the CPU usage rises to undesirable levels. I know that I can avoid that by introducing some delay, but Thread.Sleep(1) is very inaccurate and the real transmission interval between packs grows; await Task.Delay(1) seems to produce the same effect.

Does anybody know an alternative way to introduce an accurate delay in tasks?

Thanks in advance!

YRod
  • Windows isn't a real-time OS :/ – Matías Fidemraizer Dec 21 '16 at 22:18
  • Other SO answers, e.g. http://stackoverflow.com/questions/6254703/thread-sleep-for-less-than-1-millisecond seem to say spinning the CPU, as you're doing, is the way to go. – Scott Mermelstein Dec 21 '16 at 22:21
  • Understanding what you mean by "very inaccurate" would be helpful. How much variance are you seeing? What's your system's timer interval (the default is 15 ms)? – EricLaw Dec 21 '16 at 22:21
  • You may wait the appropriate sleep delay by inserting "Thread.Sleep(Pack.Interval-(timeSpan.TotalMilliseconds - Pack.LastSent.TotalMilliseconds))" as the last instruction of the while loop. – Graffito Dec 21 '16 at 22:35
  • I'm trying to have as small a variance as I can... pretty sure that Thread.Sleep(x) won't solve the problem; even when x is greater than 15 it cannot accomplish the delay with small variance, that's why I'm asking :) – YRod Dec 21 '16 at 22:41
  • It should work if Pack.Interval>2. Execute "Sleep(x)" only if x>=1. – Graffito Dec 21 '16 at 22:53
  • What is the CPU usage for this? – Mike Zboray Dec 22 '16 at 02:12
  • If I start 4 independent tasks like the one in the example, the CPU usage can reach 80% – YRod Dec 22 '16 at 14:07
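Graffito's suggestion in the comments can be combined with the question's existing spin loop into a hybrid wait: sleep coarsely for most of the interval, then spin only near the deadline. A minimal sketch (my own illustration, not code from the thread; the `WaitUntil` name and the 16 ms safety margin are assumptions):

```csharp
using System;
using System.Diagnostics;
using System.Threading;

static class HybridWait
{
    // Sleeps for most of the remaining time, then spin-waits the last
    // stretch. The 16 ms margin covers the default ~15 ms timer tick,
    // so Thread.Sleep never overshoots the deadline.
    public static void WaitUntil(Stopwatch clock, TimeSpan deadline)
    {
        TimeSpan coarse = deadline - clock.Elapsed - TimeSpan.FromMilliseconds(16);
        if (coarse > TimeSpan.Zero)
            Thread.Sleep(coarse);

        // Fine phase: burn CPU only for the final couple of milliseconds.
        var spinner = new SpinWait();
        while (clock.Elapsed < deadline)
            spinner.SpinOnce();
    }
}
```

With a 1 ms interval the coarse phase never kicks in, so this only reduces CPU usage for the longer (25 ms and up) intervals the question mentions.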

3 Answers


How to introduce an accurate small [1ms] delay without CPU overload [on Windows]?

You can't, sorry. The system scheduler on Windows is only slightly adjustable (by selecting "Adjust for best performance of Applications" in the advanced system properties dialog on Windows Server, or by setting a registry value), but it won't go into sub-millisecond territory. If it did, the performance of the entire system would suffer unacceptably.

Depending on your hardware, I think it might be possible to reduce the system clock resolution as low as 0.5ms; however, the minimum thread quantum you can set is 6, which would require two clock ticks to reduce to 0. So you'd still end up with a quantum of 1ms, which is at least twice as slow as you'd need. And, of course, you'd reduce your battery life by ~15% (from what I've read).

For more information, read Windows Internals.

Stephen Cleary

Windows is not a real-time OS, so timers are not guaranteed to be precise. A typical system clock has about 15 ms accuracy. However, it is possible to get more accurate events than the standard System.Threading.Timer provides. The Windows API has timers designed for multimedia scenarios that can fire at more precise intervals. I have updated the code in a GitHub repo I maintain, HighPrecisionTimer, which leverages that API, to include a Task-based MultimediaTimer.Delay method:

private static async Task RunEveryMillisecond(CancellationToken token)
{
    Stopwatch s = Stopwatch.StartNew();
    while (true)
    {
        Console.WriteLine(s.ElapsedMilliseconds);
        await MultimediaTimer.Delay(1, token);
        if (Console.KeyAvailable)
        {
            return;
        }
    }
}

Note that this fires about every 1.5 ms on my system, and it can still cost about 10% CPU while running, so it is not a negligible hit on system resources. The standard timer methods included in the project are a bit more accurate and efficient (less CPU overhead, ~1%) for running methods at the 1 ms level. My guess is that there is more allocation and garbage-collection overhead in the Task-based Delay method.

Note that using this API can have side effects like shorter battery life. However, it is useful for test-type scenarios where shorter timings are desired.
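For context, a Task-based delay over the multimedia timer API can be sketched roughly like this. timeSetEvent in winmm.dll is the real Win32 API; the `MmDelay` wrapper below is my own illustration and is not the code from the linked repo:

```csharp
using System;
using System.Runtime.InteropServices;
using System.Threading.Tasks;

static class MmDelay
{
    private delegate void TimeProc(uint id, uint msg, IntPtr user, IntPtr dw1, IntPtr dw2);

    [DllImport("winmm.dll")]
    private static extern uint timeSetEvent(uint delay, uint resolution,
        TimeProc callback, IntPtr user, uint eventType);

    private const uint TIME_ONESHOT = 0;

    // One-shot, millisecond-resolution delay backed by a multimedia timer
    // (Windows only; timeSetEvent returns 0 on failure).
    public static Task Delay(uint milliseconds)
    {
        var tcs = new TaskCompletionSource<bool>();
        GCHandle handle = default(GCHandle);
        TimeProc callback = (id, msg, user, dw1, dw2) =>
        {
            tcs.TrySetResult(true);
            handle.Free(); // the delegate may be collected once it has fired
        };
        handle = GCHandle.Alloc(callback); // keep the delegate alive for native code
        if (timeSetEvent(milliseconds, 0, callback, IntPtr.Zero, TIME_ONESHOT) == 0)
        {
            handle.Free();
            tcs.TrySetException(new InvalidOperationException("timeSetEvent failed"));
        }
        return tcs.Task;
    }
}
```

The GCHandle is needed because the garbage collector cannot see the reference that native code holds to the marshaled delegate.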

Mike Zboray
  • Thank you very much! The CPU overhead introduced (~20% for the whole application running the communication) is nothing compared to the one I had before (~80% and higher); I can deal with that. In the case of battery life I must test the performance, but it seems to me that I won't have a problem since the communication periods are not too long and I stop the timer when idle. Thanks! ;) – YRod Dec 22 '16 at 16:55

You can start n consecutive threads (say n = 10 or 20), each started 1 ms after the previous one, and have each thread wait for n milliseconds between sends.
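A sketch of this idea (my own illustration; the thread count N is arbitrary and the send call is a placeholder): N workers started about 1 ms apart, each sleeping N ms, so collectively one send happens roughly every millisecond.

```csharp
using System;
using System.Threading;

class StaggeredSenders
{
    const int N = 10; // number of worker threads (illustrative)

    static void Main()
    {
        for (int w = 0; w < N; w++)
        {
            new Thread(() =>
            {
                while (true)
                {
                    // SendData(Pack) from the question would go here.
                    Thread.Sleep(N); // each worker fires once every N ms
                }
            }) { IsBackground = true }.Start();

            Thread.Sleep(1); // stagger the workers by roughly 1 ms
        }

        Console.ReadKey(); // keep the process alive
    }
}
```

Note that Thread.Sleep still only has system-timer accuracy (~15 ms by default), so both the 1 ms stagger and the per-worker period will drift; this trades precision for low CPU use.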

ALK007