
Following up on my last question, I'm curious how XmlSerializer converts DateTime.

I have a field in my XML file that looks like so:

<date>2011-01-10T00:00:00-05:00</date>

I'd like to deserialize this to a DateTime.

This is EST by the looks of it. However, when I look at the result, I get a DateTime object whose value is 2011-01-09 21:00:00. It seems to be converting to my local time (PST).
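Roughly, the deserialization looks like this (Record is just a placeholder wrapper type; only the <date> element comes from the real file):

using System;
using System.IO;
using System.Xml.Serialization;

// Placeholder wrapper type; only the <date> element is taken from the real file.
public class Record
{
    [XmlElement("date")]
    public DateTime Date { get; set; }
}

class Program
{
    static void Main()
    {
        const string xml = "<Record><date>2011-01-10T00:00:00-05:00</date></Record>";

        var serializer = new XmlSerializer(typeof(Record));
        using (var reader = new StringReader(xml))
        {
            var record = (Record)serializer.Deserialize(reader);

            // On a PST machine this prints "2011-01-09 21:00:00 (Local)".
            Console.WriteLine("{0:yyyy-MM-dd HH:mm:ss} ({1})", record.Date, record.Date.Kind);
        }
    }
}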

  1. Why is this happening?
  2. How can I preserve the actual date given in the XML during serialization? I'd prefer to keep UTC where possible during deserialization.

Thank you!

EDIT

I do not have control over the server nor the ability to change its format.

  • Try something like this https://stackoverflow.com/a/3534625 – Raj Jan 27 '18 at 03:17
  • This is clearly not a formatting issue; however, it is hard to determine how to proceed. Where did the XML come from, perhaps a server that was set to EST? Can you ask for the payload to be in UTC with no offsets? – Ross Bush Jan 27 '18 at 04:29
  • What is your server set to? UTC helps tons with datetime math. I wish everyone stored their data in UTC; it would make things easy. – Ross Bush Jan 27 '18 at 04:37
  • I don't have control over the server sending me this. It is a third party. –  Jan 27 '18 at 04:39
  • Can you ask for a time zone designator (there is no z on that date string)? – Ross Bush Jan 27 '18 at 05:03

1 Answer


DateTime does not carry time zone information, so it cannot preserve the original offset when an absolute time is read. What you see is the ISO 8601 value with its -05:00 offset correctly converted to your local time zone, which is why a PST machine shows 2011-01-09 21:00:00.
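A small standalone sketch of the same conversion, using DateTime.Parse rather than the serializer (the effect is the same: an explicit offset is folded into local time, while AdjustToUniversal gives the UTC instant the asker wants to keep):

using System;
using System.Globalization;

class Demo
{
    static void Main()
    {
        const string text = "2011-01-10T00:00:00-05:00";

        // Default parsing: the -05:00 offset is applied and the result is local time.
        DateTime local = DateTime.Parse(text, CultureInfo.InvariantCulture);

        // AdjustToUniversal keeps the same instant but expressed as UTC (Kind = Utc).
        DateTime utc = DateTime.Parse(text, CultureInfo.InvariantCulture,
                                      DateTimeStyles.AdjustToUniversal);

        Console.WriteLine("{0:yyyy-MM-dd HH:mm:ss} ({1})", local, local.Kind); // 2011-01-09 21:00:00 (Local) on a PST box
        Console.WriteLine("{0:yyyy-MM-dd HH:mm:ss} ({1})", utc, utc.Kind);     // 2011-01-10 05:00:00 (Utc)
    }
}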

The only built-in options when serializing back are UTC (the "Z" designator) and local time; see Force XmlSerializer to serialize DateTime as 'YYYY-MM-DD hh:mm:ss' for a way to choose the format you want.
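The usual approach from that linked question is a string proxy property; a minimal sketch (Record and DateString are placeholder names, and the format string is just an example):

using System;
using System.Globalization;
using System.Xml.Serialization;

public class Record
{
    // The real value; not serialized directly.
    [XmlIgnore]
    public DateTime Date { get; set; }

    // Proxy property: XmlSerializer reads/writes this string, so you control the exact format.
    [XmlElement("date")]
    public string DateString
    {
        get { return Date.ToUniversalTime().ToString("yyyy-MM-dd'T'HH:mm:ss'Z'", CultureInfo.InvariantCulture); }
        set { Date = DateTime.Parse(value, CultureInfo.InvariantCulture, DateTimeStyles.AdjustToUniversal); }
    }
}

The serializer never sees the DateTime itself, only the string, so any format works as long as the getter and setter agree.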

If you need to preserve the time zone, look at external libraries such as NodaTime that support it.
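If NodaTime is more than you need, one alternative (not part of the original answer) is DateTimeOffset behind the same kind of string proxy, since XmlSerializer does not handle DateTimeOffset directly; Record and DateString are again placeholder names:

using System;
using System.Globalization;
using System.Xml.Serialization;

public class Record
{
    // DateTimeOffset keeps the original -05:00 offset instead of folding it into local time.
    [XmlIgnore]
    public DateTimeOffset Date { get; set; }

    [XmlElement("date")]
    public string DateString
    {
        // Round-trip ("o") format, e.g. 2011-01-10T00:00:00.0000000-05:00
        get { return Date.ToString("o", CultureInfo.InvariantCulture); }
        set { Date = DateTimeOffset.Parse(value, CultureInfo.InvariantCulture); }
    }
}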

Alexei Levenkov
  • DateTime is always stored in a PC as a number in UTC. When reading or writing a time, the software on the computer automatically converts it to a string using the computer's time zone setting. – jdweng Jan 27 '18 at 08:58
  • @jdweng I'm not sure about that... I thought the motherboard clock is local and the OS needs to know the time zone to move it to UTC (and https://superuser.com/questions/482860/does-windows-8-support-utc-as-bios-time seems to say local is the default). I'm not an expert in this, so your comment could well be correct, but I'm not sure how it relates to parsing a time with a time zone. – Alexei Levenkov Jan 27 '18 at 19:34
  • Always stored in UTC and transferred between computers as UTC when transferred as a number. Only when data is transferred as a string is it sent as local time along with a time zone setting. Suppose you have a database and a user logs in from another time zone. How is the remote connection to know what the database time zone is? It is just reading a time that is UTC. – jdweng Jan 27 '18 at 20:03
  • @jdweng :) you mean "it is *good practice* to use UTC (or any other date-time format that explicitly includes the time zone offset) to store and transfer absolute date-time values" - totally agree if that's what you mean... The way you phrased your comment makes it sound like an absolute truth (i.e. set in hardware/built into all protocols), which is not the case; people send local date-times without time zone info all the time (and complain later that nothing works). – Alexei Levenkov Jan 27 '18 at 20:08