I save a DateTime value using C# Entity Framework, and when I load it back from the database, it differs from the value I saved by one or more milliseconds.
Here is the C# code:
public List<DateTime> TestDate()
{
    var dates = new List<DateTime>();

    // Capture the value to be saved.
    DateTime testvalue = DateTime.Now;
    dates.Add(testvalue);

    // Save the value with one context, then dispose it.
    IactexGMG2Entities firstContext = new IactexGMG2Entities();
    var firstQuery = from p in firstContext.LocationProperties
                     where p.locationPropertyId == 4
                     select p;
    var firstRec = firstQuery.Single();
    firstRec.locationPropertyDateTime = testvalue;
    firstContext.SaveChanges();
    firstContext.Dispose();

    // Read the value back with a fresh context so it comes from the database.
    IactexGMG2Entities secondContext = new IactexGMG2Entities();
    var secondQuery = from p in secondContext.LocationProperties
                      where p.locationPropertyId == 4
                      select p;
    var secondRec = secondQuery.Single();
    var secondDate = secondRec.locationPropertyDateTime ?? DateTime.Now;
    dates.Add(secondDate);
    secondContext.Dispose();

    return dates;
}
Here are the resulting values (the saved value first, then the value read back):
5/29/2015 5:43:25 PM . 154 , 635685182051540566
5/29/2015 5:43:25 PM . 153 , 635685182051530000
Here is the Razor code that displays the values:
@foreach (var date in Model)
{
    counter++;
    <div>
        @date . @date.Millisecond , @date.Ticks
    </div>
}
As you can see, the second value, which was read back from the database, is 1.0566 milliseconds lower than the first. The size of the discrepancy varies from run to run, in both directions, but it is always a small number of milliseconds.
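For reference, here is the arithmetic behind that 1.0566 ms figure as a small snippet, using the two tick values shown in the output above (one DateTime tick is 100 nanoseconds, so 10,000 ticks make one millisecond):

// Tick values copied from the output above.
long savedTicks  = 635685182051540566;   // value I saved
long loadedTicks = 635685182051530000;   // value read back from the database

long diffTicks = savedTicks - loadedTicks;    // 10566 ticks
double diffMs  = diffTicks / 10000.0;         // 1.0566 ms
Console.WriteLine("{0} ticks = {1} ms", diffTicks, diffMs);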
Does anyone know how the conversion between the date values takes place?
Note: If I read the date value back using the same context, the values match. I assume that is because the context returns its cached value rather than the value stored in SQL Server.
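For what it's worth, that same-context behaviour looks consistent with Entity Framework's identity map: a query on a context that already tracks an entity with that key hands back the tracked instance, with its in-memory values, rather than re-materializing the row. A rough sketch of what I mean, reusing the context and entity names from the code above:

using (var context = new IactexGMG2Entities())
{
    var rec = context.LocationProperties.Single(p => p.locationPropertyId == 4);
    rec.locationPropertyDateTime = DateTime.Now;
    context.SaveChanges();

    // Same context: the identity map returns the already-tracked instance,
    // so the DateTime seen here is the in-memory value, not what SQL Server stored.
    var again = context.LocationProperties.Single(p => p.locationPropertyId == 4);
    bool sameInstance = ReferenceEquals(rec, again);   // true
}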