I'm experiencing some rounding issues between .NET Core 3.0 and .NET Framework/.NET Core 2.x.
I've been searching the web for a while, but I couldn't find the right term to search for, so I'm posting it here.
I wrote the following sample console app to illustrate my problem:
class Program
{
    static void Main(string[] args)
    {
        const double x = 123.4567890 / 3.14159265358979;
        Console.WriteLine(x);
        const double y = 98.76543210 / 3.14159265358979;
        Console.WriteLine(y);
        const double z = 11.2233445566778899 / 3.14159265358979;
        Console.WriteLine(z);
        Console.ReadKey();
    }
}
I ran this program on different frameworks and got the following output:
- .NET Framework 4.7.2:
  - 39,2975164552063
  - 31,4380134506439
  - 3,57250152843761
- .NET Core 2.0:
  - 39,2975164552063
  - 31,4380134506439
  - 3,57250152843761
- .NET Core 3.0:
  - 39,2975164552063
  - 31,438013450643936
  - 3,5725015284376096
As you can see, the .NET Core 3.0 output differs from the first two and shows extra precision from roughly the 13th digit after the decimal point onward.
I assume the .NET Core 3.0 output is the more accurate one.
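To narrow down whether only the default string formatting changed or the computed values themselves, a small diagnostic like the following could be added to the sample (a minimal sketch; "G15" and "G17" are standard numeric format specifiers, and BitConverter.DoubleToInt64Bits exposes the raw bit pattern of the double, so identical bit patterns on both runtimes would mean only the formatting differs):

using System;

class FormatCheck
{
    static void Main()
    {
        const double y = 98.76543210 / 3.14159265358979;

        // Default ToString: this is where the runtimes disagree.
        Console.WriteLine(y);

        // Fixed 15-significant-digit formatting.
        Console.WriteLine(y.ToString("G15"));

        // 17 significant digits, enough to round-trip a double.
        Console.WriteLine(y.ToString("G17"));

        // Raw bit pattern of the stored value.
        Console.WriteLine(BitConverter.DoubleToInt64Bits(y).ToString("X16"));
    }
}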
However, my goal is to migrate from .NET Framework to .NET Core 3.0. Before migrating, I wrote tests for the .NET Framework library to make sure the calculations give the same output after the migration to .NET Core 3.0. For that, I wrote tests like:
//Arrange
const double expectedValue = 0.1232342802302;
//Act
var result = Subject.Calculate();
//Assert
result.Should().Be(expectedValue);
If I migrate the code and run the tests that I wrote against .NET Framework, the tests fail with minor differences like:
Expected item[0] to be 0.4451391569556069, but found 0.44513915698437145.
Expected result to be -13.142142181869094, but found -13.142142181869062.
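One possible workaround would be to assert with a tolerance instead of exact equality, for example with FluentAssertions' BeApproximately (a sketch; I'm assuming FluentAssertions is the library behind the Should() calls, and the tolerance value is just a placeholder), but I would prefer to keep exact expected values:

//Arrange
const double expectedValue = 0.1232342802302;
const double tolerance = 1e-9; // placeholder tolerance, not taken from the real tests
//Act
var result = Subject.Calculate();
//Assert
result.Should().BeApproximately(expectedValue, tolerance);

This only hides the difference rather than explaining it, which brings me to my actual question.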
My question is: how do I force .NET Core 3.0 to round the same way as .NET Framework/.NET Core 2.0 does, so that I don't get these minor differences?
And could anyone explain this difference, or describe the changes in rounding behavior between .NET Core 3.0 and .NET Framework?