6

Based on ideas presented in link I implemented several different "sleep methods". One of these methods was the "binary sleep", which looks like this:

while (System.currentTimeMillis() < nextTimeStamp)
{
    // halve the remaining sleep interval on every pass
    sleepTime -= (sleepTime / 2);
    sleep(sleepTime);
}

Because the check whether the next time step has already been reached takes place at the beginning of the loop, I would expect the method to run too long. But the cumulative distribution of the simulation error (expected time - real time) looks like this: http://img267.imageshack.us/img267/4224/errorvscummulativearran.jpg
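For illustration, a single measurement of that error could look roughly like the following sketch (stepMillis and the println are placeholders, not my actual harness):

public class SleepErrorProbe {
    public static void main(String[] args) throws InterruptedException {
        long stepMillis = 50;                                     // hypothetical step length
        long nextTimeStamp = System.currentTimeMillis() + stepMillis;
        long sleepTime = stepMillis;

        // binary sleep: halve the remaining wait until the target time is reached
        while (System.currentTimeMillis() < nextTimeStamp) {
            sleepTime -= (sleepTime / 2);
            Thread.sleep(sleepTime);
        }

        // error = expected time - real time (negative means the step finished late)
        long error = nextTimeStamp - System.currentTimeMillis();
        System.out.println("error [ms]: " + error);
    }
}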

Does somebody have an idea why I'm getting these results? Maybe the method System.currentTimeMillis() does not really return the current time?

BR,

Markus

@irreputable

When I made the evaluation I also created a bell curve using a German statistics program. Because it was not possible to change the captions, here is the English translation of all relevant items:

Häufigkeit = frequency

Fehler = error

Mittelwert = average

Std-Abw = standard deviation

http://img694.imageshack.us/img694/2254/bellcurve.jpg

  • I tried to ask a question at http://stackoverflow.com/questions/ask and I copy-pasted your topic subject into my new test question, and while entering the message field I got a list of all (very) relevant topics which already answer your question. I was wondering if you have actually noticed and explored that? – BalusC Dec 14 '09 at 17:58
  • Ohhh, you are right. Maybe I should look more often at related topics before writing a question :-) – Markus Dec 14 '09 at 18:04
  • +1 for the curve. we knew it's not precise, but it's nice to see some quantification. can you make it a bell curve? – irreputable Dec 14 '09 at 19:03

3 Answers

10

No, it does not. Its younger brother System#nanoTime() has a much better precision than System#currentTimeMillis().

Apart from the answers in their Javadocs (click the links above), this subject has been discussed several times here as well. Do a search on "currenttimemillis vs nanotime" and you'll find, among others, this topic: System.currentTimeMillis vs System.nanoTime.
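As an illustration of the difference (a sketch, not code from the question), the same binary-sleep loop could be driven by System.nanoTime() instead:

public class NanoBinarySleep {
    // Sketch only: the binary sleep from the question, but the deadline check
    // uses System.nanoTime(), which has a much finer resolution.
    static void binarySleep(long waitMillis) throws InterruptedException {
        long deadline = System.nanoTime() + waitMillis * 1000000L;
        long sleepTime = waitMillis;
        while (System.nanoTime() < deadline) {
            sleepTime -= (sleepTime / 2);   // halve the remaining wait
            Thread.sleep(sleepTime);
        }
    }

    public static void main(String[] args) throws InterruptedException {
        long start = System.nanoTime();
        binarySleep(50);                    // hypothetical 50 ms wait
        System.out.println("slept ~" + (System.nanoTime() - start) / 1000000.0 + " ms");
    }
}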

BalusC
  • nanoTime() has better precision but it's accuracy is still the same as currentTimeMillis() (depends on the underlying operating system). See http://en.wikipedia.org/wiki/Accuracy_and_precision – Steve Kuo Dec 14 '09 at 18:07
  • Thanks for the heads up (English is not my native). I've fixed it. – BalusC Dec 14 '09 at 18:18
  • However .. Your English is also not that good. It should be "its" (of it) and not "it's" (it is) :o) – BalusC Dec 14 '09 at 18:41
2

Per the docs,

 * Returns the current time in milliseconds.  Note that
 * while the unit of time of the return value is a millisecond,
 * the granularity of the value depends on the underlying
 * operating system and may be larger.  For example, many
 * operating systems measure time in units of tens of
 * milliseconds.
Jonathan Feinberg
2

What you are seeing is the underlying clock resolving to 15 ms resolution. This is a feature of the OS and its interrupt rate. There is a patch for the Linux kernel that increases this resolution to 1 ms; I'm not sure about Windows. There have been a number of posts about this already.
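A small probe along these lines can make that tick size visible on a particular machine (just a sketch; the exact values depend on the OS and JVM):

public class ClockGranularityProbe {
    public static void main(String[] args) {
        // Busy-wait and print how far System.currentTimeMillis() jumps each
        // time it changes; the jumps reveal the effective clock granularity.
        long last = System.currentTimeMillis();
        int observed = 0;
        while (observed < 10) {
            long now = System.currentTimeMillis();
            if (now != last) {
                System.out.println("clock advanced by " + (now - last) + " ms");
                last = now;
                observed++;
            }
        }
    }
}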

Joel