I have learned that SQL Server's datetime type stores values with less precision than the .NET Framework's DateTime. This is very unfortunate in the following circumstance: suppose I have a DataRow filled from my object properties - some of which are DateTime - and a second DataRow filled from the data for that object as persisted in SQL Server:
// DataRow has no public constructor, so both rows come from some DataTable
DataRow drFromObj = someTable.NewRow();
drFromObj.ItemArray = itemArrayOfObjProps;
DataRow drFromSQL = // blah, select from SQL Server
Using the DataRowComparer on these two DataRows will give an unexpected result:
// This returns false nearly 100% of the time, because SQL Server rounded the more precise
// .NET DateTime to the less precise SQL Server datetime
DataRowComparer.Default.Equals(drFromObj, drFromSQL);
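To make the mismatch concrete, here is a minimal sketch (no database required) that assumes SqlDateTime coarsens a value the same way a datetime column does:
using System;
using System.Data.SqlTypes;

class PrecisionDemo
{
    static void Main()
    {
        // A .NET DateTime carries 100-nanosecond ticks; SQL Server's datetime
        // type only resolves to increments of roughly 3.33 ms
        DateTime fromObject = DateTime.Now;

        // Round-tripping through SqlDateTime loses the extra precision,
        // which is what happens when the value is stored in a datetime column
        DateTime fromSql = new SqlDateTime(fromObject).Value;

        Console.WriteLine(fromObject.Ticks);      // full 100 ns resolution
        Console.WriteLine(fromSql.Ticks);         // coarsened to 1/300-second resolution
        Console.WriteLine(fromObject == fromSql); // almost always False
    }
}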
My question was going to be, 'How do other people deal with this reality in a safe and sane manner?' I was also going to rule out converting to strings or writing my own DataRowComparer. I was going to offer that, in the absence of better advice, I would change the 'set' on all of my DateTime properties to convert the incoming value to a System.Data.SqlTypes.SqlDateTime and back, like so:
private Nullable<DateTime> _InsDate;

public Nullable<DateTime> InsertDate
{
    get
    {
        return _InsDate;
    }
    set
    {
        if (value.HasValue)
        {
            // Round-tripping through SqlDateTime coerces the value to the
            // precision SQL Server's datetime column will actually store
            _InsDate = (DateTime)new System.Data.SqlTypes.SqlDateTime(value.Value);
        }
        else
        {
            _InsDate = null;
        }
    }
}
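For what it is worth, the coercion is idempotent, so a value that has already gone through the setter should survive the trip to a datetime column unchanged. A minimal sketch of that check (no database involved, just the same SqlDateTime conversion):
using System;
using System.Data.SqlTypes;

class RoundTripCheck
{
    static void Main()
    {
        // Simulate the setter: coerce the value once on the way into the object
        DateTime coerced = (DateTime)new SqlDateTime(DateTime.Now);

        // Coercing it again - roughly what persisting to a datetime column does -
        // changes nothing, so the object-side row and the SQL-side row now agree
        DateTime persisted = (DateTime)new SqlDateTime(coerced);
        Console.WriteLine(coerced == persisted);  // True
    }
}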
I know full well that this would probably get screwed up the moment I used the _InsDate variable directly somewhere rather than going through the property. So my other suggestion was going to be simply using System.Data.SqlTypes.SqlDateTime for all properties where I might want a DateTime to round-trip to SQL Server (and, happily, SqlDateTime is nullable). This post changed my mind, however, and seemed to fix my immediate problem.

My new question is, 'What are the caveats or real-world experiences with using the SQL Server datetime2(7) data type rather than the good old datetime data type?'
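For context, this is roughly how I would expect to hand the full-precision value to a datetime2(7) column from ADO.NET; the connection string, table, and column names are made up, and the relevant part is SqlDbType.DateTime2:
using System;
using System.Data;
using System.Data.SqlClient;

class Datetime2Demo
{
    static void Main()
    {
        // Hypothetical connection string and table - the point is the parameter type
        using (var conn = new SqlConnection("Server=.;Database=MyDb;Integrated Security=true"))
        using (var cmd = new SqlCommand("INSERT INTO Items (InsertDate) VALUES (@InsertDate)", conn))
        {
            // datetime2(7) resolves to 100 ns, the same as DateTime.Ticks,
            // so no coercion through SqlDateTime is needed before comparing rows
            cmd.Parameters.Add("@InsertDate", SqlDbType.DateTime2).Value = DateTime.Now;

            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}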