static void Main(string[] args)
{
    double a = 222.65;
    double b = 0.056124761669643426;
    double c = (double)((decimal)a * (decimal)b);
}
Why do these calculations give different results on different operating systems? This part always gives the same result:
((decimal)a * (decimal)b)
After casting back to double I get either:
12.496178185746102
or
12.496178185746105
My problem is that this minor difference has a big impact on the final result and the tests fail.
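To see where the difference really comes from, a standalone snippet roughly like the one below (not part of the project, just a diagnostic sketch) can be run on both machines; it prints the decimal product, the round-trip text of the double, and its raw 64-bit pattern, so the two machines can be compared bit for bit:

using System;

class UlpCheck
{
    static void Main()
    {
        double a = 222.65;
        double b = 0.056124761669643426;
        decimal exact = (decimal)a * (decimal)b;   // reportedly the same on every machine
        double c = (double)exact;                  // this is what differs between machines

        Console.WriteLine(exact.ToString());                               // decimal product
        Console.WriteLine(c.ToString("R"));                                // round-trip double
        Console.WriteLine(BitConverter.DoubleToInt64Bits(c).ToString("X16")); // raw bit pattern
    }
}

If the last hex digits differ between machines, the two values really are neighbouring doubles and the difference happens in the decimal-to-double conversion, not later in the pipeline.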
Now the important information:
- The project is built with .NET 4.0 on the same machine.
- Both machines have .NET 4.0 and .NET 4.5.2 installed.
- The project is run as an x86 application.
- I get the first result on machines running Windows 7, Windows Server 2003, and Windows Server 2008.
- I get the second result on machines running Windows Server 2012 and Windows 10.
- I am not sure about the CLR version, but I assume it ships with .NET and should therefore be the same.
It seems that something changed starting with Windows 8 / Windows Server 2012 (both were released together). I had always thought that results could only be affected by the .NET version. Any ideas?
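To rule out a runtime mismatch, something along these lines could be run on each machine (this is only an assumption on my side that Environment.Version plus the documented "Release" value under the NDP registry key is enough to tell the installed 4.x runtimes apart):

using System;
using Microsoft.Win32;

class ClrInfo
{
    static void Main()
    {
        // Version of the CLR the process is actually executing on.
        Console.WriteLine("Environment.Version: " + Environment.Version);

        // Installed .NET Framework 4.x release number from the registry
        // (distinguishes the in-place 4.0 / 4.5 / 4.5.2 / 4.6 updates).
        using (RegistryKey ndpKey = RegistryKey
            .OpenBaseKey(RegistryHive.LocalMachine, RegistryView.Registry32)
            .OpenSubKey(@"SOFTWARE\Microsoft\NET Framework Setup\NDP\v4\Full"))
        {
            object release = (ndpKey != null) ? ndpKey.GetValue("Release") : null;
            Console.WriteLine("NDP Release: " + (release ?? "not found"));
        }
    }
}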
Edit: Since my original description was misleading, here is an example:
double a = 222.65;
double b = 0.056124761669643426;
double c = (double)((decimal)a * (decimal)b);
double result = Process(c);   // does something very complicated
Assert.That(result, Is.EqualTo(expectedResult).Within(1E-8));   // here is the impact
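Process itself is too complicated to post, so the stand-in below is purely hypothetical; it only illustrates how a difference in the last bits of c can be amplified past the 1E-8 tolerance once it gets scaled up:

using System;

class ToleranceDemo
{
    // Hypothetical stand-in for the real Process(): anything that scales its
    // input also scales the tiny difference between the two machines' values.
    static double Process(double x)
    {
        return x * 1e8;
    }

    static void Main()
    {
        double c1 = 12.496178185746102;   // value seen on the older systems
        double c2 = 12.496178185746105;   // value seen on the newer systems

        double diff = Math.Abs(Process(c1) - Process(c2));
        Console.WriteLine(diff);          // on the order of 1E-7, above the 1E-8 tolerance
    }
}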