Storing a DateTime into a database as local time is generally not a great idea, because:
- If your local time zone has daylight saving time rules, you could potentially lose data. During the fall-back transition, there is a one-hour period during which a single local time represents two possible moments. Using UTC avoids this issue (a sketch after this list illustrates the ambiguity).
- It binds your data to a particular location. If you ever need to migrate your server to another location (or to the cloud), your past data will be inaccurate.
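To illustrate the first point, here is a minimal sketch (assuming the Windows "Pacific Standard Time" time zone id; the IANA id would be "America/Los_Angeles") showing that a local time captured during the fall-back transition maps to two distinct UTC instants:

using System;

class AmbiguousLocalTimeDemo
{
    static void Main()
    {
        // Windows id for US Pacific Time (assumed; use "America/Los_Angeles" on Linux/macOS)
        var pacific = TimeZoneInfo.FindSystemTimeZoneById("Pacific Standard Time");

        // 1:30 AM on 2013-11-03 occurred twice in Pacific time:
        // once at 08:30 UTC (PDT, -7) and again at 09:30 UTC (PST, -8)
        var local = new DateTime(2013, 11, 3, 1, 30, 0);

        Console.WriteLine(pacific.IsAmbiguousTime(local));      // True
        foreach (var offset in pacific.GetAmbiguousTimeOffsets(local))
            Console.WriteLine(local - offset);                  // the two distinct UTC instants
    }
}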
If you must store your data in local time, consider storing it as a DateTimeOffset instead. By recording the offset, you can accurately distinguish the unique point in time it represents, even if the local time zone changes. This adequately counters both of the points above.
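For example, a small sketch (reusing the two Pacific offsets from the ambiguity above) showing how the recorded offset keeps the two moments distinct:

using System;

class DateTimeOffsetDemo
{
    static void Main()
    {
        // The same ambiguous wall-clock time, stored with its offset recorded
        var beforeFallBack = new DateTimeOffset(2013, 11, 3, 1, 30, 0, TimeSpan.FromHours(-7)); // PDT
        var afterFallBack  = new DateTimeOffset(2013, 11, 3, 1, 30, 0, TimeSpan.FromHours(-8)); // PST

        Console.WriteLine(beforeFallBack.UtcDateTime);       // 11/3/2013 8:30:00 AM
        Console.WriteLine(afterFallBack.UtcDateTime);        // 11/3/2013 9:30:00 AM
        Console.WriteLine(beforeFallBack == afterFallBack);  // False - two distinct points in time
    }
}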
See also: DateTime vs DateTimeOffset.
As to your question about JSON.Net and the DateTimeZoneHandling.Local option, this simple example shows that it appears to be working as designed. I can't reproduce your claim.
public class Foo
{
    public DateTime DateTime { get; set; }
}

...

var json = "{\"DateTime\":\"2014-01-01T00:00:00.000Z\"}";
var settings = new JsonSerializerSettings
{
    DateTimeZoneHandling = DateTimeZoneHandling.Local
};
var foo = JsonConvert.DeserializeObject<Foo>(json, settings);
Debug.WriteLine("{0} ({1})", foo.DateTime, foo.DateTime.Kind);
Output:
12/31/2013 16:00:00 (Local)
My local time zone is US Pacific Time, which was 8 hours behind UTC on the date provided; that offset is reflected in the output.
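For comparison, here is a quick sketch of the same payload deserialized with DateTimeZoneHandling.Utc, which keeps the value in UTC rather than converting it to the machine's local zone (the output shown is what I would expect on my machine):

var utcSettings = new JsonSerializerSettings
{
    DateTimeZoneHandling = DateTimeZoneHandling.Utc
};
var fooUtc = JsonConvert.DeserializeObject<Foo>(json, utcSettings);
Debug.WriteLine("{0} ({1})", fooUtc.DateTime, fooUtc.DateTime.Kind);
// 1/1/2014 00:00:00 (Utc)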