I've been banging my head against the wall all day trying to figure out why this is happening and how to fix it. I'm building a C# .NET MVC web application. I query the database using LINQ like this:
calibHistory = db.CalibrationHistory
.Where(d => d.ID == EquipmentId && d.ChangeNum == maxChangeNum).First();
This returns a C# double (the equivalent of SQL Server's float) with the following precision:
2.3980000019073486
I take this object and bind it to a form in a partial view, which works fine. However, something in the Razor rendering causes the value to be rounded off to this:
2.39800000190735
Razor is clearly dropping the last two digits (and rounding the trailing 4 up to a 5). Debugging shows that Razor knows the correct value: in the model where the double is held, you can see the full precision. Here's where things get interesting, though: querying the table in SQL Server Management Studio shows the value with the same precision Razor displays, not the extra precision of the double returned by LINQ.
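For what it's worth, the truncated string matches what I'd expect from the default double formatting on .NET Framework (15 significant digits), which I assume is what Razor falls back to when it renders the double directly. A minimal sketch of that assumption, outside MVC entirely:

using System;

class PrecisionCheck
{
    static void Main()
    {
        // The value LINQ hands back, copied from the debugger.
        double fromLinq = 2.3980000019073486;

        // Default ToString() on .NET Framework formats a double with up to
        // 15 significant digits, which matches the Razor output exactly.
        Console.WriteLine(fromLinq.ToString());      // 2.39800000190735
        Console.WriteLine(fromLinq.ToString("G17")); // 2.3980000019073486
        Console.WriteLine(fromLinq.ToString("R"));   // round-trip format, same as G17 here
    }
}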
So somehow, LINQ is getting more precision than what SQL Server is providing. I want to compare the posted values from the form against the results of the LINQ query above to see if anything has changed, and the precision mismatch breaks that comparison. I haven't been able to figure this out, so any help would be greatly appreciated.
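What I'm effectively trying to write is something like the sketch below. The method and the tolerance are placeholders of mine, and comparing within a tolerance is just a workaround I'm considering, not something I've settled on:

using System;
using System.Globalization;

static class CalibrationComparer
{
    // Returns true when the posted form value differs from the stored value.
    // Comparing within a small absolute tolerance sidesteps the mismatch
    // between the displayed (rounded) value and the full-precision double.
    public static bool HasChanged(string postedText, double storedValue)
    {
        double posted = double.Parse(postedText, CultureInfo.InvariantCulture);
        return Math.Abs(posted - storedValue) > 1e-9;
    }
}

I'd call it with the raw posted string and the corresponding double from calibHistory (property names omitted since they depend on my model).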
TL;DR: LINQ is returning more precision for doubles than SQL Server reports, which breaks equality checks. I'd like to either fix the LINQ query or make Razor show the full precision.