We have an issue on a customer site, whereby we're importing a CSV file that includes two date fields (a start and a finish date/time, accurate to the second). The import code calculates the difference between the two dates as a TimeSpan, then saves the TotalSeconds value to the database (in a real field).
This works perfectly in our development environment, but for some reason, on the customer site, the calculated difference is frequently off by a tiny fraction: a time difference of 123 seconds, for example, shows up in the database as 123.0001 seconds or 122.9999 seconds. We cannot reproduce the problem here.
I recall that, many years ago, some Pentium processors had a bug that caused odd floating-point calculation errors (such that they were nicknamed 5.0001-ium processors), but I don't recall the details. Is it possible there's a similar issue on the customer site, whereby date/time calculations are being thrown off by a particular kind of processor? Can you think of any other possible reasons for this odd behavior?
The code is pretty simple. I've edited out some extraneous stuff, but it goes like this:
DateTime startDate, endDate;
// startDate and endDate are set by parsing the two date/time fields from the CSV row
var timeDiff = endDate.Subtract(startDate);
// we then save timeDiff.TotalSeconds (a double) to the database
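If it helps, here is the date arithmetic boiled down to a minimal, self-contained console program (the class name, timestamps, and format string below are made up for illustration; the real parsing code is different). This should print exactly 123:

using System;
using System.Globalization;

class TimeDiffRepro
{
    static void Main()
    {
        // Illustrative values only - the real CSV uses a different layout and format string.
        var startDate = DateTime.ParseExact("2013-05-01 10:00:00",
            "yyyy-MM-dd HH:mm:ss", CultureInfo.InvariantCulture);
        var endDate = DateTime.ParseExact("2013-05-01 10:02:03",
            "yyyy-MM-dd HH:mm:ss", CultureInfo.InvariantCulture);

        var timeDiff = endDate.Subtract(startDate);

        // "R" (round-trip) format prints the double at full precision, so any
        // fractional error in the calculation itself would be visible here.
        Console.WriteLine(timeDiff.TotalSeconds.ToString("R"));
    }
}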