I have a DateTime instance with Kind = DateTimeKind.Utc and a TimeSpan:
var dt = DateTime.UtcNow;
var ts = TimeSpan.FromDays(1);
When I localize dt and then add ts, I get a different result than when I add ts and then localize, due to daylight savings.
var localizedFirst = dt.ToLocalTime() + ts; //Does account for daylight savings
var addedFirst = (dt + ts).ToLocalTime(); //Does not account for daylight savings
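To make the difference concrete, here is the same comparison using the exact values listed under "Other details" below (fresh variable names so the snippet stands alone; the commented results are what I see with my Eastern US local time zone):
var dtUtc = new DateTime(2017, 11, 5, 14, 36, 13, DateTimeKind.Utc); // 11/5/2017 2:36:13pm UTC
var span = TimeSpan.FromDays(699);
Console.WriteLine(dtUtc.ToLocalTime() + span);   // 10/5/2019 9:36:13am
Console.WriteLine((dtUtc + span).ToLocalTime()); // 10/5/2019 10:36:13am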
This seems very strange. Shouldn't adding an offset from localization and adding an offset from a timespan be commutative and associative?
I found a similar question: Why doesn't DateTime.ToLocalTime() take into account daylight savings? That question deals more with converting a DateTime to and from a String. I am working only with DateTime and TimeSpan arithmetic.
The best answer to that question suggested using DateTimeKind.Unspecified, so that the runtime will assume the unspecified date is UTC and convert it properly when localizing. I was very surprised that this actually worked. If I create a new DateTime like this:
var dt2 = new DateTime(dt.Ticks, DateTimeKind.Unspecified);
Then both orders of operations return the correct result with daylight savings:
(dt2 + ts).ToLocalTime()
dt2.ToLocalTime() + ts
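For what it's worth, here is a quick check of the Kind at each step (the commented values are what the documentation leads me to expect):
Console.WriteLine(dt.Kind);                       // Utc
Console.WriteLine((dt + ts).Kind);                // Utc - adding a TimeSpan preserves Kind
Console.WriteLine(dt2.Kind);                      // Unspecified
Console.WriteLine((dt2 + ts).Kind);               // Unspecified
Console.WriteLine(dt2.ToLocalTime().Kind);        // Local
Console.WriteLine((dt2.ToLocalTime() + ts).Kind); // Local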
This all seems absurd to me. Why do I need to convert a Utc date to Unspecified just to convert it to Local properly? This seems like it should be considered a bug.
Other details:
- Framework: .NET 4.6.1
- My local timezone: Eastern Standard Time (USA)
- The actual value of dt: 11/5/2017 2:36:13pm UTC
- The actual value of ts: TimeSpan.FromDays(699)
- Local equivalent of dt: 11/5/2017 9:36:13am
- Value of (dt + ts).ToLocalTime(): 10/5/2019 10:36:13am
- Value of dt.ToLocalTime() + ts: 10/5/2019 9:36:13am
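Since ToLocalTime() depends on the machine's local time zone, here is a sketch that reproduces the numbers above explicitly against Eastern time with TimeZoneInfo (assuming the Windows zone id "Eastern Standard Time"):
var eastern = TimeZoneInfo.FindSystemTimeZoneById("Eastern Standard Time");
var utcValue = new DateTime(2017, 11, 5, 14, 36, 13, DateTimeKind.Utc);
var offset699 = TimeSpan.FromDays(699);
// Convert first, then add: the EST offset (-5) in effect on 11/5/2017 is baked into the result.
Console.WriteLine(TimeZoneInfo.ConvertTimeFromUtc(utcValue, eastern) + offset699); // 10/5/2019 9:36:13am
// Add first, then convert: the EDT offset (-4) in effect on 10/5/2019 is applied.
Console.WriteLine(TimeZoneInfo.ConvertTimeFromUtc(utcValue + offset699, eastern)); // 10/5/2019 10:36:13am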