10

Thread.Sleep() resolution varies from 1 to 15.6ms

Given this console app:

using System;
using System.Diagnostics;
using System.Threading;

class Program
{
    static void Main()
    {
        int outer = 100;
        int inner = 100;
        Stopwatch sw = new Stopwatch();

        for (int j = 0; j < outer; j++)
        {
            int i;
            sw.Restart();
            for (i = 0; i < inner; i++)
                Thread.Sleep(1);
            sw.Stop();
            Console.WriteLine(sw.ElapsedMilliseconds);
        }
    }
}

I expected the output to be 100 numbers close to 100. Instead, I get something like this:

99 99 99 100 99 99 99 106 106 99 99 99 100 100 99 99 99 99 101 99 99 99 99 99 101 99 99 99 99 101 99 99 99 100 99 99 99 99 99 103 99 99 99 99 100 99 99 99 99 813 1559 1559 1559 1559 1559 1559 1559 1559 1559 1559 1559 1559 1559 1559 1559 1559 1559 1559 1559 1559 1560 1559 1559 1559 1559 1558 1558 1558 1558 1558 1558 1558 1558 1558 1558 1558 1558 1558 1558 1558 1558 1558 1558 1558 1559 1558 1558 1558 1558 1558

But sometimes I won't get any accurate results; it will be ~1559 every time.

Why isn't it consistent?

Some googling taught me that 15.6ms is the length of a timeslice, so that explains the ~1559 results. But why is it that I sometimes get the right result, and other times I just get a multiple of 15.6? (e.g. Thread.Sleep(20) will usually give ~31.2ms)

How is it affected by hardware or software?

I ask this because of what led me to discover it:

I had been developing my application on a 32 bit dual-core machine. Today my machine was upgraded to 64bit quad-core with a complete OS re-install. (Windows 7 and .NET 4 in both cases, however I can't be sure the older machine had W7 SP1; the new one does.)

Upon running my application on the new machine, I immediately noticed my forms take longer to fade out. I have a custom method to fade my forms which uses Thread.Sleep() with values varying from 10 to 50. On the old system this seemed to work perfectly every time. On the new system it's taking much longer to fade than it should.

Why did this behavior change between my old system and my new system? Is this related to hardware, or software?

Can I make it consistently accurate? (~1ms resolution)

Is there something I can do in my program to make Thread.Sleep() reliably accurate to about 1ms? Or even 10ms?

Igby Largeman
  • A real program shouldn't Sleep() anyway. But the standard Timers have the same resolution. – H H Sep 30 '11 at 19:18
  • I liked the question the way I wrote it. Please refrain from making edits which don't obviously correct a mistake. – Igby Largeman Sep 30 '11 at 19:45
  • Making readers press unnecessary PageDowns for totally meaningless data is a mistake. – H H Sep 30 '11 at 19:54
  • @Henk: my original post is _shorter_ than your edit. Your edit shortened someone else's edit which was what made it unnecessarily long. And the data isn't meaningless. – Igby Largeman Sep 30 '11 at 20:01
  • Just out of curiosity, does the decrease in resolution -- the point where you stop seeing it sleep for 100ms, and start seeing it pause for 1500ms instead -- correspond to any kind of external event? Say, your application losing focus? – Joe White Sep 30 '11 at 21:40
  • @Joe: no, nothing that I could detect. Apart from background processes my machine was idle. – Igby Largeman Oct 01 '11 at 00:25
  • 2
    Maybe some other process, or the OS itself, is frequently doing `timeBeginPeriod(1)`/`timeEndPeriod(1)`. – Igby Largeman Oct 01 '11 at 00:37

5 Answers

7

The answer is not to use Thread.Sleep; use a high-resolution timer instead. You'll need to do your fade in a busy loop, but it sounds like that would be no problem. You simply cannot expect high resolution from Thread.Sleep, and it is notorious for behaving differently on different hardware.

You can use the Stopwatch class in .NET, which uses the high-resolution performance counter if the hardware supports one.
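A minimal sketch of that busy-wait, assuming `Stopwatch.IsHighResolution` is true on the machine (the `BusyWait`/`Wait` names are illustrative, not from the answer):

```csharp
using System;
using System.Diagnostics;

class BusyWait
{
    // Spin until 'ms' milliseconds have elapsed on the high-resolution timer.
    // This burns a core while waiting, which is acceptable for a short UI fade.
    public static void Wait(int ms)
    {
        var sw = Stopwatch.StartNew();
        while (sw.ElapsedMilliseconds < ms)
        {
            // busy loop; no Thread.Sleep, so resolution is ~1 ms or better
        }
    }
}
```

Unlike Thread.Sleep, this never depends on the system timer resolution, at the cost of CPU cycles for the duration of the wait.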

David Heffernan
  • 2
    I went with this solution. I got millisecond accuracy by sitting in a loop while checking StopWatch.MillisecondsElapsed. As you said, a busy loop is fine in this case. – Igby Largeman Oct 01 '11 at 00:22
  • But I still wish someone could tell me what's happening - why it switches between 1ms and 15ms resolution seemingly at random. – Igby Largeman Oct 01 '11 at 00:30
  • My only guess is that some other process, or the OS itself, is doing `timeBeginPeriod(1)`/`timeEndPeriod(1)`. – Igby Largeman Oct 01 '11 at 00:36
7

“The Sleep function suspends the execution of the current thread for at least the specified interval.”

-> http://social.msdn.microsoft.com/Forums/en/clr/thread/facc2b57-9a27-4049-bb32-ef093fbf4c29

Tobias
3

I can answer one of my questions: Can I make it consistently accurate? (~1ms resolution)

Yes, it seems that I can, using timeBeginPeriod() and timeEndPeriod().

I've tested this and it works.

Some things I've read suggest that calling timeBeginPeriod(1) for the duration of the application is a bad idea. However, calling it at the start of a short method and then clearing it with timeEndPeriod() at the end of the method should be okay.

Nevertheless I will also investigate using timers.
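For reference, a minimal P/Invoke sketch of the short-method pattern described above (assuming the standard winmm.dll exports; Windows-only, and `SleepAccurately` is an illustrative name):

```csharp
using System;
using System.Runtime.InteropServices;
using System.Threading;

class TimerResolution
{
    // winmm.dll multimedia-timer functions; both return 0 (TIMERR_NOERROR) on success.
    [DllImport("winmm.dll")]
    static extern uint timeBeginPeriod(uint uPeriod);

    [DllImport("winmm.dll")]
    static extern uint timeEndPeriod(uint uPeriod);

    static void SleepAccurately(int ms)
    {
        timeBeginPeriod(1);       // request 1 ms system timer resolution
        try
        {
            Thread.Sleep(ms);     // now sleeps close to 'ms' instead of a multiple of 15.6 ms
        }
        finally
        {
            timeEndPeriod(1);     // always restore the default resolution
        }
    }
}
```

Every timeBeginPeriod call must be matched by a timeEndPeriod call with the same argument, hence the try/finally.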

Igby Largeman
3

You have a program running on your machine that is calling timeBeginPeriod() and timeEndPeriod(). Typically a media-related program that uses timeSetEvent() to set a one millisecond timer. It affects the resolution of Sleep() as well.

You could pinvoke these functions yourself to get consistent behavior. Not terribly reasonable for UI effects though. It is rather unfriendly to battery life on a laptop.

Sleeping for 20 msec and actually getting 2/64 seconds is otherwise logical: the CPU simply won't wake up soon enough to notice that 20 msec have passed. You only get multiples of 1/64 seconds. So a reasonable choice for a Timer that implements fading effects is 15 msec, giving you 64 fps animation in the worst case, assuming that your effect draws fast enough. You'll be a bit off if timeBeginPeriod was called, but not by much. Calculating the animation stage from the clock works too, but is a bit overkill in my book.
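A sketch of what that 15 msec Timer approach looks like for a fading form (assuming WinForms; the `FadeHelper`/`FadeOut` names are illustrative, not from the answer):

```csharp
using System;
using System.Windows.Forms;

static class FadeHelper
{
    // Fade 'form' out using a WinForms timer with a 15 ms interval:
    // one opacity step per tick, so a 500 ms fade takes ~33 ticks.
    public static void FadeOut(Form form, int durationMs)
    {
        var timer = new Timer { Interval = 15 };      // ~1/64 second per tick
        int steps = Math.Max(1, durationMs / 15);     // ticks for the whole fade
        double delta = form.Opacity / steps;          // opacity change per tick
        timer.Tick += (s, e) =>
        {
            form.Opacity -= delta;
            if (form.Opacity <= 0)
            {
                timer.Stop();
                timer.Dispose();
                form.Close();
            }
        };
        timer.Start();
    }
}
```

Because the ticks arrive on the UI thread, the form repaints between steps without any busy-waiting.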

Hans Passant
  • That's true, 16ms is enough resolution for that particular situation. The dramatic increase happened because the method worked by looping 100 times, changing opacity from 1 to 0 in 1% increments. So 100 * 3 = 0.3 seconds became 100 * 15.6 = 1.6 seconds. So I can either use timeBeginPeriod/timeEndPeriod or I can rework the loop so that it only loops (fade milliseconds / 16) times and changes opacity in increments of (1 / number of loops). I've tried both methods and they both work well. (Or I can use a high resolution timer) – Igby Largeman Sep 30 '11 at 21:44
  • 1
    Well, as I pointed out, don't sweat the small stuff. But don't use 16, that's too long and will get you either 2/64 or 0.016 seconds. Use 15, gets you either 1/64 or 0.015 seconds. – Hans Passant Sep 30 '11 at 21:55
  • Noted, thanks. In the end I used a StopWatch and simply looped while checking MillisecondsElapsed; this method is accurate to the millisecond. Eating the cycles is okay in this case. – Igby Largeman Oct 01 '11 at 00:19
  • That's a bad idea. Your animation will run a lot smoother when you sleep or use a timer. When you burn cycles like you do, the thread scheduler will put you in the dog house for a while after you burned through the quantum to give other threads in other processes a chance to run. – Hans Passant Jan 23 '13 at 03:03
  • Well geez Hans, you coulda told me sooner! I don't work for that company, or even live in that country anymore! It really worked fine though. I was just fading out a form over about half a second. – Igby Largeman Jan 23 '13 at 21:50
  • Well geez, you said you didn't care so I didn't either. Other users are voting for this answer, they ought to know what the problem with your shortcut is. – Hans Passant Jan 23 '13 at 22:01
-2

Your approach is wrong. You will never get Sleep to be accurate, and the busier your machine is, the more inaccurate your sleep loop will become.

What you should be doing is looking at how much time has passed and adapting accordingly. Although spinning around Sleep is a bad idea, "better" ways of doing it will be significantly more complex. I'll keep my suggested solution simple.

  • Call DateTime.Now.Ticks and save it in a variable (startTick).
  • In your loop, call Sleep as you are already doing.
  • Call DateTime.Now.Ticks again and subtract startTick from it - this value will be how many 100 nanosecond units of time have passed since you started the fade (timeSinceStart).
  • Using timeSinceStart calculate how much you should be faded with that much time elapsed.
  • Repeat until you have completely faded it.
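The steps above can be sketched as follows (the `OpacityAt` helper is illustrative; the `form.Opacity` assignment is left as a comment since it depends on the OP's UI code):

```csharp
using System;
using System.Threading;

class ElapsedFade
{
    // Opacity for a fade of 'durationMs' that started at 'startTick',
    // based on how much wall-clock time has actually passed.
    // Ticks are 100 ns units: 10,000 ticks per millisecond.
    public static double OpacityAt(long startTick, long nowTick, int durationMs)
    {
        double elapsedMs = (nowTick - startTick) / 10_000.0;
        return Math.Max(0.0, 1.0 - elapsedMs / durationMs);
    }

    static void Main()
    {
        const int fadeMs = 200;
        long startTick = DateTime.Now.Ticks;
        double opacity = 1.0;
        while (opacity > 0)
        {
            Thread.Sleep(10);   // may oversleep; the elapsed-time math corrects for it
            opacity = OpacityAt(startTick, DateTime.Now.Ticks, fadeMs);
            // form.Opacity = opacity;  // apply to the form here
        }
    }
}
```

Even if Sleep(10) actually sleeps 15.6 ms, the total fade still finishes in roughly `fadeMs` because each step is computed from real elapsed time.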
Doug65536
  • If I call Sleep in the loop as I am already doing, how would this make any difference? – Igby Largeman Sep 30 '11 at 20:06
  • -1 DateTime Ticks has similar problems, as Now is tied to the system clock resolution, e.g. 1/64 second. – Tim Lloyd Sep 30 '11 at 21:09
  • You use DateTime.Now.Ticks to track how much time Sleep ACTUALLY slept. He is fading something, he won't be needing super high precision timing. His problem is essentially that sleep might sleep longer than requested - my solution is to calculate how long it actually slept so it behaves properly when the system is under load. Why does it matter that he can't sleep for extremely precise period of time - accept the reality that you might not have 100% control over the cpu and sleep won't necessarily sleep the right amount. My algorithm fades correctly based on elapsed time. – Doug65536 Sep 30 '11 at 21:38
  • The same pattern applies for everything involving timing - calculating how much time ACTUALLY passed. Say you were going to update a KB/sec display every second, would you expect your update code to be called perfectly every second? I guarantee it won't. You try to update it every second, and your algorithm calculates how much time has ACTUALLY passed and how much bytes have been transferred when calculating the KB/sec. – Doug65536 Sep 30 '11 at 21:43
  • You state that Ticks represents how many 100 nanosecond intervals have passed when queried, it doesn't. – Tim Lloyd Sep 30 '11 at 21:45
  • And by the way, calling timeBeginPeriod to try to hack Sleep to be accurate is a very very bad idea. You are essentially saying you should bog down the entire operating system and burn out people's laptop batteries faster in a futile attempt to make Sleep sleep an accurate amount of time when it is specifically documented to often sleep longer than requested. When the system is under load, Sleep is guaranteed to sleep a LOT longer than requested, timeBeginPeriod or not. – Doug65536 Sep 30 '11 at 21:47
  • @chibacity It does if you subtract the timestamp of when the fade started from the value you got just now. Straight from the MSDN docs: A single tick represents one hundred nanoseconds or one ten-millionth of a second. There are 10,000 ticks in a millisecond. – Doug65536 Sep 30 '11 at 21:49
  • @Doug Try writing a loop that queries Now.Ticks, you will see it only updates every 1/64 second. – Tim Lloyd Sep 30 '11 at 21:52
  • The definition of "elapsed time" is how much time has passed since some reference point. If you save the tick count at the start of the operation, then subtracting the starting tick count from the current tick count will result in the number of ticks since that point. – Doug65536 Sep 30 '11 at 21:52
  • What are you arguing? That he needs to be more accurate than 1/64th of a second? He's **fading** something. 1/64th is more than accurate enough. The screen will only be updating around 60 Hz, you saying he needs to update the video memory more often than a monitor refreshes? – Doug65536 Sep 30 '11 at 21:53
  • But it is not going to be in 1/64 second intervals is it? It'll be all over the place. – Tim Lloyd Sep 30 '11 at 21:59
  • @chibacity You're completely missing the point. I don't care if ticks is accurate to 1/64th of a second, or accurate to 1 picosecond. The point I am trying to get across is, you sleep so you don't waste CPU and you calculate how much ACTUAL time has elapsed every update. You can use whatever you like to get the elapsed time. I prefer Ticks (FILETIME units) because it is simply a number which you can use to calculate and crossing midnight or crossing into another day or month or year are irrelevant. – Doug65536 Sep 30 '11 at 22:02
  • I'm not missing any point, the OP is looking for something to generate regular "events" so that he can perform smooth animation. If anything use a StopWatch for Ticks *not* DateTime.Now. – Tim Lloyd Sep 30 '11 at 22:05
  • Everyone is fighting over how to get it SUPER accurate, and nobody realizes that it is not possible for it to be super accurate, and it is foolish to try. Is it a game engine running at 60fps on a GPU? No, it's boggy windows GUI code. The accuracy of the timing is absolutely irrelevant, getting it to fade correctly no matter the load on the system or speed of the system is what is important. What if it was DirectX code - then you'd still have the same problem, what if its a laptop GPU and the frame rate is slower - you need to fade accurately over time at 60 fps or at 10 fps. – Doug65536 Sep 30 '11 at 22:07
  • A stopwatch is the EXACT SAME THING as DateTime.Now. What member of the stopwatch is he going to use to get the elapsed time? ElapsedTicks if he's smart - all the other members are calculated off of ticks anyway, except they do floating point junk to convert it to those units. – Doug65536 Sep 30 '11 at 22:12
  • 1
    You are very wrong. Use a StopWatch. StopWatch is not the same as DateTime.Now. StopWatches are high resolution and represent real Ticks, DateTime.Now Ticks are low resolution and only updated every 1/64 seconds. There is endless information about this if you look for it. – Tim Lloyd Sep 30 '11 at 22:14
  • In the end I used this inside my fade loop: `sw.Restart(); while (sw.ElapsedMilliseconds < ms) ;` where `sw` is a StopWatch and `ms` is the number of milliseconds I want to wait for. This method is accurate to the millisecond. In this case it doesn't matter that I'm eating the cycles. – Igby Largeman Oct 01 '11 at 00:11
  • @chibacity: I don't need events, a simple wait is fine. I don't need to yield because 1) it's never more than half a second (total) and 2) the application doesn't need to be responsive when this happens anyway. – Igby Largeman Oct 01 '11 at 00:16
  • @Charles Sure, I enclosed "events" in quotes as I was speaking figuratively i.e. conceptual events, rather than literal events. StopWatch is the right thing to use. Burning round in a tight loop is a bit of a sledge-hammer, but if it works for you... – Tim Lloyd Oct 01 '11 at 11:27