
I have a C# time application where the user can create a timer and start it; the server then does something like this:

(The dates are stored in CEST; the problem is the switch between summer and winter time.)

// insert into db
var entry = new TimeEntry();
entry.Start = DateTime.Now;    // e.g. 04/03/2013 12:00

_timeEntries.Add(entry);    // => now it's in the db

If the user stops the timer, the server does something like this:

// update the entry
var entry = _timeEntries.Get(1);    // 1 is the id of the entry created above
entry.End = DateTime.Now;    // e.g. 04/03/2013 12:25

_timeEntries.Update(entry);    // => now it updates it in the db

Something like this will then be shown to the user in the browser:

TimerId: 1

Start: 04/03/2013 12:00

Stop: 04/03/2013 12:25

Total (in min): 25

Now I think there will be some problems when the user changes his timezone. What's the best way to solve this? Store the dates as UTC and then convert with JavaScript?

Thanks for your help :-).

asked by Don Merdino
    _"Now I think there will be some problems when the user changes his timezone"_ - why do you think that? – CodeCaster Apr 03 '13 at 10:46
  • Is it necessary to show the date and time in the browser? Is it enough to show the amount of time used for the operation, the way Stack Overflow does it, e.g. "asked 4 mins ago"? If you still need to respect the client's date and time format, you can get the localization information from the client's HTTP request and then format the information on the server (see the sketch after these comments). – Siraf Apr 03 '13 at 10:51
  • @CodeCaster because the dates are saved in CEST and the users have different time zones. Am I wrong? – Don Merdino Apr 03 '13 at 11:07
  • Yes, you determine the date on your server. Client's settings don't matter. – CodeCaster Apr 03 '13 at 11:23
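
A minimal sketch of the server-side formatting idea from Siraf's comment, assuming a classic ASP.NET (System.Web) request; HttpRequest.UserLanguages and CultureInfo.GetCultureInfo are real APIs, but the helper name FormatForClient and the "g" format choice are purely illustrative:

using System;
using System.Globalization;
using System.Web;

public static class TimeFormatting
{
    // Formats a UTC timestamp using the first language the browser advertises
    // in the Accept-Language header (exposed as HttpRequest.UserLanguages).
    public static string FormatForClient(HttpRequest request, DateTime utc)
    {
        var languages = request.UserLanguages;
        var lang = (languages != null && languages.Length > 0)
            ? languages[0].Split(';')[0]    // strip a possible ";q=0.8" quality suffix
            : "en-US";                      // fallback when no header was sent

        CultureInfo culture;
        try { culture = CultureInfo.GetCultureInfo(lang); }
        catch (CultureNotFoundException) { culture = CultureInfo.InvariantCulture; }

        // "g" = that culture's short date + short time pattern
        return utc.ToString("g", culture);
    }
}

Note that this only covers the formatting part; it says nothing about which time zone the value should be shown in.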

2 Answers


From: Determine a User's Timezone

You will have to pass the offset from the client to the server like this:

new Date().getTimezoneOffset()/60;

getTimezoneOffset() will subtract your time from GMT and return the number of minutes. So if you live in GMT-8, it will return 480. To put this into hours, divide by 60. Also, notice that the sign is the opposite of what you need -- it's calculating GMT's offset from your time zone, not your time zone's offset from GMT. To fix this, simply multiply by -1.
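
A rough sketch of how the server side could apply that client-supplied value, assuming the client posts the raw minute result of getTimezoneOffset() (before any division or sign flip); the method and parameter names are made up for illustration:

using System;

public static class ClientTime
{
    // offsetMinutes is what JavaScript's getTimezoneOffset() returned on the client:
    // UTC minus local time, in minutes (e.g. 480 for GMT-8, -120 for CEST).
    public static DateTime ToClientLocal(DateTime utc, int offsetMinutes)
    {
        // local = UTC - offset, because the JS value has the opposite sign
        return utc.AddMinutes(-offsetMinutes);
    }
}

Keep in mind that a single offset is only correct for the instant it was sampled; it does not follow DST transitions, which is one reason the other answer prefers storing UTC and converting with a real time zone.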

– Mortalus

Since these are event times, you should be working strictly with UTC, and taking that value from the server.

When the client application calls you with whatever you're using for start and stop input commands, use DateTime.UtcNow to get the time and save that in the database.
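
A minimal sketch of that recording step, reusing the TimeEntry and _timeEntries names from the question (their actual types aren't shown there, so treat this as an assumption about that API):

// Start: record the instant in UTC
var entry = new TimeEntry();
entry.Start = DateTime.UtcNow;
_timeEntries.Add(entry);

// Stop: load the same entry later and record the end in UTC as well
var running = _timeEntries.Get(1);
running.End = DateTime.UtcNow;
_timeEntries.Update(running);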

You really don't want to use local time at all for recording these times. Otherwise, if the start or end falls within an ambiguous period (such as a DST/summer-time "fall back" transition), you would have no way of knowing whether the event fell before or after the transition.
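
A small sketch that shows the ambiguity, using TimeZoneInfo; "Central European Standard Time" is the Windows id for the asker's CET/CEST zone (on Linux/macOS it would be "Europe/Berlin"), and the example date is the 2013 fall-back night:

using System;

class AmbiguityDemo
{
    static void Main()
    {
        // Windows id for CET/CEST; use "Europe/Berlin" on Linux/macOS
        var cet = TimeZoneInfo.FindSystemTimeZoneById("Central European Standard Time");

        // 02:30 local on 27 Oct 2013 occurs twice: once in CEST, once in CET
        var local = new DateTime(2013, 10, 27, 2, 30, 0);

        Console.WriteLine(cet.IsAmbiguousTime(local));   // True
        Console.WriteLine(cet.GetUtcOffset(local));      // 01:00:00 (standard time is assumed)
    }
}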

Context always matters with DateTime questions, and in the context of "event times", your only reliable options are a UTC DateTime or a DateTimeOffset. Local calendar times (client or server) are not good for specific event times. They are useful in other contexts, though.

When you actually show the data back to your user, if you're just showing the duration between start and end, then you can simply calculate it from the UTC times you recorded. If you want to show the specific times back to the user, that's when you would convert the UTC time to whatever time zone the user wants to display it in. If you opt for DateTimeOffset, the advantage is that you don't have to convert at this stage.
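
A sketch of that display step, assuming UTC values matching the question and an example zone id ("Central European Standard Time", again the Windows id for CET/CEST):

using System;

class DisplayDemo
{
    static void Main()
    {
        var startUtc = new DateTime(2013, 4, 3, 10, 0, 0, DateTimeKind.Utc);
        var endUtc   = new DateTime(2013, 4, 3, 10, 25, 0, DateTimeKind.Utc);

        // The duration needs no time zone at all
        TimeSpan total = endUtc - startUtc;
        Console.WriteLine(total.TotalMinutes);    // 25

        // Only convert when you want to show the wall-clock time to the user
        var zone = TimeZoneInfo.FindSystemTimeZoneById("Central European Standard Time");
        Console.WriteLine(TimeZoneInfo.ConvertTimeFromUtc(startUtc, zone));    // 12:00 local (CEST, UTC+2)
    }
}

With DateTimeOffset you would store the local time and its offset together, so that value can be shown back without a conversion step.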

– Matt Johnson-Pint