I am working on a very time-sensitive web application. One of the business rules given to me is that the application's behavior must always depend on the time on the web server, regardless of what time is on the client's clock. To make this clear to the user, I was asked to display the server's time in the web application.

To this end, I wrote the following JavaScript code:

var clock = (function () {
    var hours, minutes, seconds;

    function setupClock(updateDisplayCallback) {
        getTimeAsync(getTimeCallback);

        function getTimeCallback(p_hours, p_minutes, p_seconds) {
            hours = p_hours;
            minutes = p_minutes;
            seconds = p_seconds;
            setInterval(incrementSecondsAndDisplay, 1000);
        }

        function incrementSecondsAndDisplay() {
            seconds++;
            if (seconds === 60) {
                seconds = 0;
                minutes++;
                if (minutes === 60) {
                    minutes = 0;
                    hours++;
                    if (hours === 24) {
                        hours = 0;
                    }
                }
            }
            updateDisplayCallback(hours, minutes, seconds);
        }
    }

    // Makes an AJAX call and invokes callback, passing hours, minutes, and seconds.
    function getTimeAsync(callback) {
        $.ajax({
            type: "POST",
            url: "Default.aspx/GetLocalTime",
            contentType: "application/json; charset=utf-8",
            dataType: "json",
            success: function (response) {
                var date, serverHours, serverMinutes, serverSeconds;
                // GetDateFromResponse (defined elsewhere) converts the JSON response into a Date.
                date = GetDateFromResponse(response);
                serverHours = date.getHours();
                serverMinutes = date.getMinutes();
                serverSeconds = date.getSeconds();
                callback(serverHours, serverMinutes, serverSeconds);
            }
        });
    }

    return {
        setup: setupClock
    };
})();

The function passed in for updateDisplayCallback is a simple function to display the date on the web page.
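
For reference, it is something along these lines (the element id "clock" and the pad helper here are just illustrative, not my actual code):

function updateDisplay(hours, minutes, seconds) {
    // zero-pad so 9:5:3 renders as 09:05:03
    function pad(n) {
        return (n < 10 ? "0" : "") + n;
    }
    document.getElementById("clock").textContent =
        pad(hours) + ":" + pad(minutes) + ":" + pad(seconds);
}

clock.setup(updateDisplay);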

The basic idea is that the JavaScript makes an asynchronous call to look up the server's time, stores it on the client, and then updates it once per second.

At first, this appears to work, but as time goes by, the displayed time falls behind by a few seconds every minute. I left it running overnight, and when I came in the next morning, it was off by more than an hour! This is entirely unacceptable because the web application may be kept open for days at a time.

How can I modify this code so that the web browser will continuously and accurately display the server's time?

Vivian River
  • Would it be an option to make the ajax-call once every minute or so, to sync with the server? – Christofer Eliasson Mar 29 '13 at 14:21
  • keep in mind that request to server takes time so you are adding several hundred ms every time you make a call. – charlietfl Mar 29 '13 at 14:22
  • setInterval is NOT ACCURATE. Make a difference from the server time and the time on the computer clock. Add that difference to the user's clock. Also remember latency of the http calls can cause the time to be off a great deal. It takes time to travel the tubes. – epascarello Mar 29 '13 at 14:22
  • I think you should poll the server at some frequency; you should be able to find a suitable interval through trial & error. You could use setInterval for the polling operation, as the results will be reasonably accurate (less the response transit time) – Jason Mar 29 '13 at 14:22
  • How long is each request taking? performing ajax requests every second is usually a bad idea due to the fact that a single ajax request can take more than a second to complete. – Kevin B Mar 29 '13 at 14:22
  • Calculate the difference in time between the client's clock and the server clock, then use that, sending an ajax request every minute or so to ensure it's still on track. Don't forget to take into account the time it takes for the ajax request to complete in your calculation. – Kevin B Mar 29 '13 at 14:26

3 Answers

JavaScript's setInterval is not accurate enough to allow you to keep time like this.

My solution would be:

  1. Periodically get the server's time in milliseconds (it does not need to be very often as the two clocks will hardly deviate that much)
  2. Get the client time in milliseconds
  3. Calculate the clock deviation between server and client (server time minus client time)
  4. Periodically update the display of the clock by getting the client time and adding the clock deviation

Edit: To be more accurate, you could measure the round trip time of the server's request, divide it by 2 and factor that delay into the clock deviation. Assuming round trips are symmetrical in their duration, this would give a more accurate calculation.
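
Something along these lines, as a sketch only (getServerTimeAsync is a stand-in for an AJAX helper like the question's getTimeAsync, except that it passes the server time as milliseconds since the epoch, and updateDisplay is whatever renders hours/minutes/seconds):

var clockDeviation = 0; // server time minus client time, in milliseconds

function sync() {
    var requestStart = new Date().getTime();
    getServerTimeAsync(function (serverMillis) {
        var now = new Date().getTime();
        var oneWayDelay = (now - requestStart) / 2; // assume a symmetrical round trip
        clockDeviation = (serverMillis + oneWayDelay) - now;
    });
}

function displayServerTime() {
    var serverNow = new Date(new Date().getTime() + clockDeviation);
    updateDisplay(serverNow.getHours(), serverNow.getMinutes(), serverNow.getSeconds());
}

sync();
setInterval(sync, 5 * 60 * 1000);     // re-sync occasionally; the two clocks barely drift
setInterval(displayServerTime, 1000); // setInterval jitter no longer accumulates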

Jorge Cardoso
  • 1. ... BUT the time for the Ajax http request to the server will differ greatly between calls based on network traffic. – epascarello Mar 29 '13 at 14:28
  • This seems to be a much more elegant solution, since it means you are not writing your own clock in `incrementSecondsAndDisplay()` but rather you're using the built in `Date` on the client. – Aram Kocharyan Mar 29 '13 at 14:30
  • @epascarello you would use the client time when the server response arrives of course, not when the request is sent. – Aram Kocharyan Mar 29 '13 at 14:30
  • @epascarello You are right. To be more accurate, you could measure the round trip time of the server's request, divide it by 2 and factor that delay into the clock deviation. Assuming round trips are symmetrical in their duration, this would give a more accurate calculation. – Jorge Cardoso Mar 29 '13 at 14:39
  • @AramKocharyan, I know that; the response back from the server is not instant. Just saying you are not going to get the 100% accurate representation of time that the OP thinks they will get. – epascarello Mar 29 '13 at 14:40

setInterval is not a reliable way to schedule time-critical events. It may take less or more than 1000 ms to run your callback, depending on how busy the JavaScript engine is at the moment.

A better approach would be to take a shorter interval and use new Date().getTime() to check if a second has passed.

The minimum interval browsers allow can be as high as 10 ms.
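
A sketch of what I mean, reusing incrementSecondsAndDisplay from the question (the 100 ms interval is just an example):

// Run a short interval, but only advance the clock when a full second of real
// time has actually elapsed according to new Date().getTime(), so the display
// cannot drift the way a bare setInterval(..., 1000) counter does.
var lastTick = new Date().getTime();

setInterval(function () {
    var now = new Date().getTime();
    while (now - lastTick >= 1000) {
        lastTick += 1000;
        incrementSecondsAndDisplay(); // the function from the question
    }
}, 100);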

Halcyon
  • There are numerous web pages out there that do display time, so it seems that there ought to be a best practice for this. – Vivian River Mar 29 '13 at 14:24
  • It'll never take less time than specified. Also, the minimum interval is not 10ms. It depends on the browser. – Robin Drexler Mar 29 '13 at 14:25
  • I would calibrate every minute to avoid issues regardless of which method I used. – Aram Kocharyan Mar 29 '13 at 14:26
  • @Robin Frits said the minimum interval is **as high** as 10 ms (which may or may not be accurate, I don't know). That doesn't imply it's always 10 ms. – jerry Mar 29 '13 at 14:27

Thanks for the answers. I have up-voted both of the answers so far, as they contain useful information. However, I am not using the exact approach prescribed in either one.

What I finally decided on is a bit different.

I wrote about what I learned on my personal web page.

First of all, I now understand that using setInterval(..., 1000) is not good enough to have something done once per second for a long time. However, 'polling' the time with a much shorter interval looking for the second to change seems very inefficient to me.

I decided that it does make sense to keep track of the 'offset' between the server time and the client time.

My final solution is to do the following:

(1) Do an AJAX call to the server to get the time. The function also checks the client time and computes the difference between the server time and the client time, in milliseconds. Due to network latency and other factors, this initial fetch may be off by a few seconds. For my purposes, this is okay.

(2) Execute a tick function. Each time the tick function executes, it checks how long it has been since it last ran, and uses that elapsed time to compute the argument passed to setTimeout, so that the time display is updated approximately once per second.

(3) Each time the tick function computes the time to be displayed, it takes the client time and adds the difference that was computed in step (1). This way, I don't depend upon the client to have the time set correctly, but I do depend upon the client to accurately measure elapsed time. For my purposes, this is okay. The most important thing is that regardless of how setTimeout may be inaccurate or interrupted by other processes (such as a modal dialog, for instance), the time displayed should always be accurate.
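
A simplified sketch of what I ended up with (names shortened; getServerClientDiffAsync stands in for the AJAX call described in step 1, and updateDisplay for the display callback):

getServerClientDiffAsync(function (diffMillis) { // diffMillis = server time minus client time
    var nextTickAt = new Date().getTime();

    function tick() {
        // Step (3): always display client time plus the stored difference, so the
        // shown time stays correct even when ticks are delayed or interrupted.
        var serverNow = new Date(new Date().getTime() + diffMillis);
        updateDisplay(serverNow.getHours(), serverNow.getMinutes(), serverNow.getSeconds());

        // Step (2): schedule the next tick relative to when it *should* run,
        // not a fixed 1000 ms, so setTimeout inaccuracy does not accumulate.
        nextTickAt += 1000;
        setTimeout(tick, Math.max(0, nextTickAt - new Date().getTime()));
    }

    tick();
});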

Vivian River
  • Don't forget that the user time may jump unexpectedly. This could be initiated manually by the user, by an NTP client, or due to a timezone/daylight saving time change. Assuming at least some of the users won't be in the same place as your server, the last one is especially problematic since the time changes by a large increment. It is also more likely to happen since your app is a long running one. Fortunately, it's also solvable without making extra server calls by simply using getTimezoneOffset() in addition to getTime(). – jerry Mar 30 '13 at 15:11
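
A minimal illustration of the check jerry describes (purely a sketch; where and how to re-sync is up to the application):

// Watch for a local time-zone / DST jump without another server call.
// getTime() is UTC-based and unaffected, but the wall-clock rendering of the
// same instant shifts when getTimezoneOffset() changes.
var lastTzOffset = new Date().getTimezoneOffset();

setInterval(function () {
    var currentTzOffset = new Date().getTimezoneOffset();
    if (currentTzOffset !== lastTzOffset) {
        // The local clock jumped by (lastTzOffset - currentTzOffset) minutes,
        // e.g. 60 at a DST transition; adjust the stored difference or re-sync here.
        lastTzOffset = currentTzOffset;
    }
}, 60 * 1000);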