
When writing a JavaScript animation, you of course make a loop using setInterval (or repeated setTimeout). But what is the best delay to use in the setInterval/setTimeout call(s)?

In the jQuery API page for the .animate() function, the user "fbogner" says:

Just if anyone is interested: Animations are "rendered" using a setInterval with a timeout of 13ms. This is quite fast! Chrome's fastest possible interval is about 10ms. All other browsers "sample" at about 20-30ms.

Any idea how jQuery determined to use this specific number?


Started bounty. I'm hoping someone with knowledge of the source code behind Chromium or Firefox can provide some hard facts to back up the choice of a specific framerate. Or perhaps a list of animations (or frameworks) and their delays. I believe this makes for an interesting opportunity to do a bit of research.


Interesting - I just took the time to look at Google's Pac-Man source to see what they did. They set up an array of possible FPSes (90, 45, 30), start at the first one, and then each frame they check the "slowness" of the frame (amount the frame exceeded its allotted time). If the slowness exceeds 50ms 20 times, the framerate is notched down to the next in the list (90 -> 45, 45 -> 30). It appears that the framerate is never raised back up, presumably because the game is so short-lived that it wouldn't be worth the trouble to code that.

Oh, and the setInterval delay is of course set to 1000 / framerate. They do, in fact, use setInterval and not repeated setTimeouts.

I think this dynamic framerate feature is pretty neat!
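
For the curious, here is a sketch of how that downgrade logic might look in JavaScript. This is my reconstruction from the description above, not Google's actual code; drawFrame() and every name in it are made up:

var FPS_LEVELS = [90, 45, 30];   // possible framerates, fastest first
var SLOWNESS_LIMIT = 50;         // ms a frame may exceed its budget before it counts as "slow"
var MAX_SLOW_FRAMES = 20;        // slow frames tolerated before notching down

var fpsIndex = 0;
var slowFrames = 0;
var intervalId = null;

function startLoop() {
  var budget = 1000 / FPS_LEVELS[fpsIndex];  // setInterval delay = 1000 / framerate
  intervalId = setInterval(function () {
    var start = Date.now();
    drawFrame();                             // whatever renders one frame
    var slowness = (Date.now() - start) - budget;
    if (slowness > SLOWNESS_LIMIT) slowFrames++;
    if (slowFrames >= MAX_SLOW_FRAMES && fpsIndex < FPS_LEVELS.length - 1) {
      clearInterval(intervalId);             // notch down: 90 -> 45 -> 30, never back up
      fpsIndex++;
      slowFrames = 0;
      startLoop();
    }
  }, budget);
}

startLoop();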

Ricket

5 Answers


I would venture to say that a substantial fraction of web users are using monitors that refresh at 60Hz, which translates to one frame every 16.66ms. So to make the monitor the bottleneck, you need to produce a new frame in under 16.66ms.

There are two reasons you would pick a value like 13ms. First, the browser needs a little bit of time to repaint the screen (in my experience, never less than 1ms), which puts you at, say, updating every 15ms. Second, 15ms happens to be a very interesting number: the standard timer resolution on Windows is 15ms (see John Resig's blog post). I suspect that a well-written 15ms animation looks very close to the same on a wide variety of browsers/operating systems.
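
To make that arithmetic explicit (the repaint cost here is an assumed value, not a measured one):

var monitorFrame = 1000 / 60;      // ≈ 16.66ms between refreshes on a 60Hz monitor
var repaint = 2;                   // assumed repaint cost, "never less than 1ms"
var timerDelay = 13;               // jQuery's delay
console.log(timerDelay + repaint); // ≈ 15ms per update: under 16.66ms, and exactly Windows' timer resolution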

FWIW, fbogner is plain wrong about non-Chrome browsers firing setInterval every 20-30ms. I wrote a test to measure the speed of setInterval firing, and got these numbers:

  • Chrome - 4ms
  • Firefox 3.5 - 15ms
  • IE6 - 15ms
  • IE8 - 15ms
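
The test itself isn't shown here, but a minimal version of such a measurement might look like this (averaging 100 ticks of a nominally 0ms setInterval):

var last = Date.now();
var total = 0, ticks = 0;

var id = setInterval(function () {
  var now = Date.now();
  total += now - last;             // time since the previous tick
  last = now;
  if (++ticks === 100) {
    clearInterval(id);
    console.log('average interval: ' + (total / ticks).toFixed(1) + 'ms');
  }
}, 0);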
Long Ouyang

The basic algorithm, written out here as runnable JavaScript rather than pseudo-code (doAnimation() stands in for whatever draws one frame):

var FPS_WANTED = 25;                      // just a number; it can be changed while executing, or kept constant
var TIME_OF_DRAWING = 1000 / FPS_WANTED;  // frame budget in milliseconds; recompute when FPS_WANTED changes

function loop() {                         // runs until the user leaves the drawing application
  var time1 = Date.now();
  doAnimation();
  var time2 = Date.now();
  var animationTime = time2 - time1;

  if (animationTime > TIME_OF_DRAWING) {
    // FPS_WANTED cannot be reached. You can:
    // 1. Decrease FPS_WANTED to see if a lower framerate can be achieved
    // 2. Do nothing, because you want to get all you can from the CPU
    setTimeout(loop, 0);
  } else {
    // FPS_WANTED can be reached. You can:
    // 1. Wait out the rest of the frame budget, to keep a constant framerate of FPS_WANTED
    // 2. Increase the framerate if you want
    // 3. Do nothing, because you want to get all you can from the CPU
    setTimeout(loop, TIME_OF_DRAWING - animationTime);
  }
}

loop();

There can be variations of this, of course, but this is the basic algorithm, and it applies to any kind of animation.

INS
  • 10,594
  • 7
  • 58
  • 89
3

When writing animation loops, it's best to find a balance between the speed of the loop and how much work needs to be done each iteration.

For example, say you want to slide a div across the page within a second, so the effect is nice and timely. Rather than moving one pixel at a time, you would skip coordinates on each tick and use a reasonably fast loop time, so the motion is noticeable but not jumpy.

So it's a trial-and-error thing that has to take the per-frame work, the loop time, and browser capability into account, so that the result doesn't only look nice in one browser.
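
To make that trade-off concrete, here is a minimal sketch (the element id and all the numbers are made up for illustration): moving 400px in one second with a 20ms loop means stepping 8px per tick.

var el = document.getElementById('slider');  // hypothetical element
var DISTANCE = 400;   // total pixels to travel
var DURATION = 1000;  // total time in milliseconds
var DELAY = 20;       // loop delay: fast enough to look smooth, slow enough to leave time for the work
var STEP = DISTANCE / (DURATION / DELAY);    // 8 pixels per tick

var x = 0;
var id = setInterval(function () {
  x += STEP;
  el.style.left = Math.min(x, DISTANCE) + 'px';
  if (x >= DISTANCE) clearInterval(id);      // stop once the div has arrived
}, DELAY);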

Jay

The numbers fbogner gives have been tested. Browsers throttle JavaScript activity to a certain degree so that the page stays usable at all times.

If your JavaScript were allowed to run every 5ms, the browser runtime would have much less CPU time to refresh the rendering or react to user input (clicks), because JavaScript execution blocks the browser.

I think the Chrome devs allow you to run your JavaScript at much shorter intervals than the other browsers because their V8 JavaScript engine compiles the JavaScript, so it runs faster and the browser is not blocked as long as it would be by interpreted JS code.

But the engine being faster isn't the only reason shorter intervals are allowed; the devs have certainly tested what the shortest interval is that still keeps the browser from being blocked for too long.

Tobias P.

I don't know the reasoning behind jQuery's interval time, as 13ms translates to roughly 77fps, which is very fast. The "standard" fps used in movies and such is 25fps, which is fast enough that the human eye won't notice any jittering. 25fps translates to 40ms, so to answer your question: anything at 40ms or below is enough for an animation.

Tatu Ulmanen
  • Jitter is noticeable at 30 fps for games. When you've been playing at 200 fps and it drops to 40fps, it is noticeable. Then again, that might have to do with monitor refreshing... BTW, when browsers render do they use vertical sync? If so, you're pretty much locked at 50/60 fps... – Warty May 30 '10 at 20:35
  • Movies are captured at 24 fps, which is slow enough to capture natural motion blur. Your brain smooths everything out when watching a film. Animation rendered dynamically isn't going to have the benefit of being captured from real life. – lincolnk Jun 17 '10 at 13:36