
I've been trying to figure this out for the past few days, but I'm only losing my hair over it.

I'm using the JavaScript below for a timer that counts up from a start date. The start date is a timestamp from the database, saved at the very moment the timer is started, and it is saved relative to the user's timezone, which is selected in their profile.

My server's PHP timezone is set to 'America/Los_Angeles', and my user profile timezone is set to the same. That is also the format in which the timezone is saved.

However, when I run the timer, even though my PHP displays the correct saved time based on the saved timezone (i.e. '05/29/2013 6:05:49'), the timer has additional hours from the get-go. So it shows "0 days 12 hours 0 mins 13 secs" instead of "0 days 0 hours 0 mins 13 secs".

This must be due to the JavaScript timezone, right? If I set my TZ to 'America/New_York', for instance, it shows "0 days 9 hours 0 mins 13 secs".
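As far as I can tell, this is consistent with `new Date(string)` interpreting a string that has no timezone designator as *local* time. A quick check I ran (the identity between `getTime`, `Date.UTC`, and `getTimezoneOffset` is what I'd expect if that's the case):

```javascript
// A date string with no timezone designator is parsed as LOCAL time,
// so the resulting instant shifts with the browser's timezone.
var d = new Date('05/29/2013 6:05:49');

// The same wall-clock fields interpreted as UTC (months are 0-based):
var asUtc = Date.UTC(2013, 4, 29, 6, 5, 49);

// The difference is exactly the local UTC offset (in minutes) at that date.
var offsetMinutes = (d.getTime() - asUtc) / 60000;
console.log(offsetMinutes === d.getTimezoneOffset()); // true
```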

How would I fix this?

JS

<script type="text/javascript">
function DaysHMSCounter(initDate, id){
    this.counterDate = new Date(initDate);
    this.container = document.getElementById(id);
    this.update();
}

DaysHMSCounter.prototype.calculateUnit = function(secDiff, unitSeconds){
    // How many whole units fit into the (absolute) difference in seconds.
    return Math.floor(Math.abs(secDiff) / unitSeconds);
}

DaysHMSCounter.prototype.calculate=function(){
    var secDiff = Math.abs(Math.round(((new Date()) - this.counterDate)/1000));
    this.days = this.calculateUnit(secDiff,86400);
    this.hours = this.calculateUnit((secDiff-(this.days*86400)),3600);
    this.mins = this.calculateUnit((secDiff-(this.days*86400)-(this.hours*3600)),60);
    this.secs = this.calculateUnit((secDiff-(this.days*86400)-(this.hours*3600)-(this.mins*60)),1);
}

DaysHMSCounter.prototype.update=function(){ 
    this.calculate();
    this.container.innerHTML =
    " <strong>" + this.days + "</strong> " + (this.days == 1? "day" : "days") +
    " <strong>" + this.hours + "</strong> " + (this.hours == 1? "hour" : "hours") +
    " <strong>" + this.mins + "</strong> " + (this.mins == 1? "min" : "mins") +
    " <strong>" + this.secs + "</strong> " + (this.secs == 1? "sec" : "secs");
    var self = this;
    setTimeout(function(){ self.update(); }, 1000);
}

window.onload=function(){ new DaysHMSCounter('05/29/2013 6:05:49', 'timer'); }

</script>

HTML

<div id="timer"></div>

2 Answers


Your problem is that you are passing in the date as a string without a timezone designator. Thus the Date is created in the browser's local timezone.

If you instead have your server send the number of milliseconds past the epoch (instead of a string-formatted date), then you can use new Date( millisPastEpoch ) and all should be good.

Alternatively, if you are guaranteed to know the user's timezone, then you can send the date formatted in the user's timezone instead of the server's.
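For example (a sketch; the variable names are illustrative, and the value would come from your server):

```javascript
// Hypothetical value sent by the server: milliseconds past the Unix epoch.
// If the database stores a Unix timestamp in SECONDS, multiply by 1000 first.
var millisPastEpoch = 1369832749000; // 2013-05-29T13:05:49Z

// Constructed this way, the Date represents an absolute instant in time;
// the browser only applies the local timezone when DISPLAYING it.
var start = new Date(millisPastEpoch);

// The elapsed time is then independent of anyone's timezone.
var elapsedMs = new Date() - start;
```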

Shadow Man
  • Would that be the same as unix timestamp? That is how it is saved to the db. – dragonfeet2012 May 30 '13 at 01:56
  • Many systems store timestamps as a number of milliseconds past an epoch (even javascript does internally). And many (if not all) conform to using "00:00:00 January 1, 1970, UTC" as the epoch. Some databases store `DATETIME` internally as strings however. But if you have saved it as the epoch already, then yes, send that. – Shadow Man May 30 '13 at 01:58
  • @dragonfeet2012 "Unix timestamp" usually refers to *seconds*. You need *milliseconds*. You can multiply by 1000 if you like. – Matt Johnson-Pint May 30 '13 at 02:01

First, consider that you are interested in a specific moment in time at which your timer started. You cannot represent that with a local calendar date-time like 05/29/2013 6:05:49, for a few reasons:

  • The user will likely be in a different time zone
  • The user may use a different date format (MM/DD/YYYY vs DD/MM/YYYY)
  • The time may be ambiguous due to Daylight Saving Time transitions

So instead, you should pass a UTC value to your user. The browser will automatically take care of adjusting for the user's local time zone.

You can represent that value as an integer number of milliseconds past January 1st, 1970 UTC, or in a standardized format such as ISO 8601: either 2013-05-29T13:05:49Z or 2013-05-29T06:05:49-07:00. PHP has facilities to produce any of these representations; for example, you can use gmdate. See this answer for an example.

If you choose to go with the ISO8601 format, you should be aware that it was a late addition to JavaScript, and is not supported natively by all browsers. Most newer browsers do support it, but if you want the best compatibility, you will need to use a library such as moment.js (which offers many other great features as well).

If you go with an integer value, you won't need a library and can pass it directly to the Date constructor, but you may find it more difficult to debug.
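To illustrate, both representations denote the same instant (the epoch value here corresponds to the timestamp in the question, assuming it was recorded in Pacific Daylight Time):

```javascript
// Both of these represent the same absolute instant, regardless of the
// browser's local timezone:
var fromInteger = new Date(1369832749000);          // ms past the epoch
var fromIso     = new Date('2013-05-29T13:05:49Z'); // ISO 8601, UTC

// true in browsers with native ISO 8601 parsing support:
console.log(fromInteger.getTime() === fromIso.getTime());
```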

Also, for reasons mentioned in this post, you might want to set your server time zone to UTC instead of America/Los_Angeles.

Matt Johnson-Pint
  • I set my server to UTC through the htaccess, then have all data converted and represented on the front-end based on their timezone. Timer seems to work now! Thank you for the suggestion. – dragonfeet2012 May 30 '13 at 23:52