12

If my user is in California and they have their computer set to PST, it's 1:00 pm there. If my server is set to EST, the current server time is 4:00 pm.

I need a way to get the timezone difference between the client and the server, either in Javascript or C#. In my example, I would get 3 (or -3, doesn't matter).

Does anyone know how to do this?

EDIT: Possible solution for RedFilter

Doing it all in javascript:

serverDate = new Date('<%= DateTime.Now.ToString() %>');
clientDate = new Date();
diffMin = (serverDate.getTime()-clientDate.getTime())/(1000*60);  //get difference in minutes

Think that would work? Or would both of those return the same time?
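A different sketch (not from the original post) that avoids parsing a date string entirely: have the server render its own UTC offset and compare it with the client's `getTimezoneOffset()`. The server-side expression in the comment is only an assumption about how the value could be emitted; it is hard-coded below for illustration.

```javascript
// serverOffsetMin would come from the server, e.g. rendered inline with
// <%= DateTimeOffset.Now.Offset.TotalMinutes %> (hypothetical). Hard-coded
// here as UTC-5 (EST) for illustration.
var serverOffsetMin = -300;

// getTimezoneOffset() returns minutes *behind* UTC, so negate it to get
// minutes east of UTC (a PST client would give -480 here).
var clientOffsetMin = -new Date().getTimezoneOffset();

var diffHours = (serverOffsetMin - clientOffsetMin) / 60;
```

For a PST client (-480) and an EST server (-300) this works out to the 3 hours from the question.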

Brian Gideon
Steven
  • If the server is set to EST, something is wrong. It should run in UTC. That will remove so many headaches related to time zones. – Fredrik Mörk Sep 01 '10 at 19:30
  • @Fredrik I don't see how that really changes things. If the application needs to know the time offset of the client, it needs to know regardless of the server timezone setting. – Pointy Sep 01 '10 at 19:32
  • My server is actually running in UTC, I just used EST as an example to avoid all the "there are functions to convert dates to UTC" answers, which wouldn't help with the problem I'm having. – Steven Sep 01 '10 at 19:33
  • I'd advocate storing data (when in data structures or database) as UTC times, but setting the entire server to UTC is a little much. – Thanatos Sep 01 '10 at 19:34
  • Also, keep in mind that 1) there are timezones with half-hour UTC offsets 2) The difference in time between two timezones changes throughout the year. – Thanatos Sep 01 '10 at 19:37
  • @Steven (and @Pointy): it does make a difference. All time zones have an offset relative to UTC, not to each other. The `TimeZoneInfo` class has a `BaseUtcOffset` property. By comparing this for two different `TimeZoneInfo` instances, you can calculate the difference between the time zones. Furthermore, the `DateTime` structure has methods for converting between local time and UTC, but not to other arbitrary time zones. So I would still say, making sure that all time on the server is UTC will greatly simplify things. That said, the context of @Steven's code could naturally be completely different. – Fredrik Mörk Sep 01 '10 at 19:40
  • @Fredrik: I thought Windows used UTC internally and setting the timezone only affected how times were displayed or if an application requested the time in a local timezone, but everything was still happening in UTC behind the scenes. – Brian Gideon Sep 01 '10 at 19:53
  • @Fredrik: The only problem with that is that I don't know how to get the client's timezone. If I could do that, I could compare the BaseUtcOffset of that timezone with the server timezone. Do you know how to get the client's timezone? – Steven Sep 01 '10 at 20:09
  • @Steven: From context clues in your update I got the feeling that your question is related to asp.net so I added that tag. Feel free to roll back. I deleted my answer as well on the same grounds. – Brian Gideon Sep 01 '10 at 20:12
  • @Steven: you may get some pointers here: http://stackoverflow.com/questions/338482/can-you-determine-timezone-from-request-variables/388219#388219 – Fredrik Mörk Sep 01 '10 at 20:39
  • @Fredrik, running a server in something other than UTC is a good way to catch someone coding webserver code using local times. Esp. if you live somewhere where UTC and localtime coincide for some of the year. – Jon Hanna Sep 01 '10 at 21:42

4 Answers

13

You could:

1 - Return the server date to the client as a Javascript date variable.
2 - Create a new javascript date client side (var currentTime = new Date();) and subtract the above date
3 - Post the result back to the server (if necessary; you may only need to know the difference client-side).

Update

Here is an example:

serverDate = new Date('<%= DateTime.Now.ToString() %>'); 
clientDate = new Date(); 
diffMin = (serverDate.getTime()-clientDate.getTime())/(1000*60);
alert("serverDate: " + serverDate + "\r\n" + "clientDate: " + clientDate + "\r\n" +
  "diffMin: " + diffMin);

If the server and client are on the same machine, you will see a diffMin approaching zero. There is a slight difference between the dates due to the time between the server-side script generating the date and the browser parsing and executing the javascript.

// This was useful for me: DateTime.Now.ToString("yyyy-MM-ddTHH:mm:ss")
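One wrinkle (my addition, not part of the original answer): because of the delay between the server rendering the date and the browser executing the script, diffMin will rarely come out as a whole number. Since real-world UTC offsets fall on 15-minute boundaries at worst (e.g. Nepal at UTC+5:45), snapping to the nearest 15 minutes recovers a clean offset:

```javascript
// Suppose the measured server-minus-client difference came out slightly off
// due to rendering and network delay:
var rawDiffMin = 179.27;

// Snap to the nearest 15-minute boundary, the finest granularity any real
// timezone offset uses.
var offsetMin = Math.round(rawDiffMin / 15) * 15; // 180, i.e. a clean 3 hours
```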

Arun Prasad E S
D'Arcy Rittich
1

Do you need to know the timezone of the location or the machine?

If you need to know the timezone of the location, then your best bet is to subscribe to a Geo-IP service, look up the client's IP, and then look up the timezone for that location (there are publicly available databases for that). It's not guaranteed, since IP geographic information is often wrong in practice, not just in theory.

Often, though, what you really want is the client machine's timezone setting. For most services I would find it annoying if, while travelling, a website decided I was in a different timezone from the one I was actually working in (I stick to my home timezone if I'm not out of it for long).

This is easily done client side. new Date().getTimezoneOffset() returns the number of minutes between UTC and local time. E.g. currently I'm in Irish Summer Time (GMT + 1hour daylight saving time), and it returns -60.

You can easily put that in a URI used by an image or XHR request, or put it in a cookie value.
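A minimal sketch of the cookie approach described above (the cookie name `tzoffset` is my own invention, not from the answer):

```javascript
// getTimezoneOffset() is positive west of UTC (480 for PST) and negative
// east of it (-60 for GMT+1 summer time, as in the answer above).
function tzOffsetCookie() {
  var offsetMin = new Date().getTimezoneOffset();
  return "tzoffset=" + offsetMin + "; path=/";
}

// In a browser you would then write: document.cookie = tzOffsetCookie();
// and the server would read the tzoffset cookie on the next request.
```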

Jon Hanna
1

This problem has been plaguing me: I'm having a similar issue, but I'm doing it server-side, in a node.js AWS Lambda. I cannot accept that I need to pass the time from the client to the server (or vice versa, should that be your case). I know the timezone, and javascript knows the rules for converting between them, so why should I need to pass it back and forth?

What I did just calculates the difference between the two timezones for whatever the date is. Below, it is set up to convert between UTC and timezones in the US, which never have partial-hour offsets, so it rounds to an integer number of hours; you'll need to rework this if you're ever working with wonky timezones like India's. You need to round because there are a few leftover milliseconds (which I suspect has to do with the decay rate of cesium and not a rounding error; not sure, don't care).

Note that this is running server-side (but the reverse will obviously work) and that my server runs UTC. Also note that when you initialize the first date variable, you need to set the date to the date you want the offset for because they can change day-to-day.

var UTCTime = new Date();
var pacificTime = new Date(UTCTime.toLocaleString("en-US",{timeZone: "America/Los_Angeles"}));
var offset = Math.round((UTCTime - pacificTime)/1000/60/60); // Math.round takes a single argument
console.log(offset); // returns 7 (during daylight saving time)

Hope this helps someone...
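The snippet above can be wrapped into a reusable function (my generalization of the same `toLocaleString` trick) that takes any IANA zone name and an optional date, since the offset changes with daylight saving:

```javascript
// Whole-hour offset between UTC and the given IANA zone at the given date.
// Rounding absorbs the leftover milliseconds mentioned above; partial-hour
// zones (India, Nepal, ...) would need the rounding removed or refined.
function hoursFromUtc(ianaZone, date) {
  var utc = date || new Date();
  // Format the instant in the target zone, then re-parse it as a local date;
  // the difference between the two is the zone's offset from UTC.
  var local = new Date(utc.toLocaleString("en-US", { timeZone: ianaZone }));
  return Math.round((utc - local) / (1000 * 60 * 60));
}

// hoursFromUtc("America/Los_Angeles") is 7 during daylight saving time, 8 otherwise.
```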

Travis
0

One good way to check the timezone difference is to know the timezone (location) where each time is taken. If you can obtain that on both the client side and the server side, you can use the TimeZoneInfo class (needs .NET 3.5, I think): TimeZoneInfo from Koders.

Convert the client and server times with their associated zones to UTC (TimeZoneInfo.ConvertTimeToUtc) and compare the two.

Khan
  • That's the problem, I don't know the timezone that the client will be navigating from, it could be from anywhere. If I could get that info somehow, that would be helpful, but I don't know how. – Steven Sep 01 '10 at 20:07
  • Then I would like to know how you obtain the time from the client. Cause I have a similar problem and I resolved it by using the location of all the contacts. I use cities location associated with a timezone. – Khan Sep 01 '10 at 20:27