I have a Unity app and a Desktop app that share some C# code, and I've encountered an inconsistency in how some lines of this code are evaluated. For example:
float a = 2.34567890F, b = 1.23456782F;
double d = a + b;
In the second line:
- Unity first converts a and b to doubles and then sums them
- Desktop first sums them and then converts the result to double
And that yields different results in d.
The question:
Is there a way to prevent Unity from doing this (or to make the Desktop app behave the same way)? Is there another way to work around this inconsistency? (Of course, without checking each line and adding explicit conversions to imitate Unity's behavior.)
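For context, one technique I've seen suggested (a sketch, not a guaranteed fix) relies on the fact that the C# specification permits float operations to be carried out at higher precision, but requires an explicit cast to `float` to round the value back down. So an explicit `(float)` cast on the sum should force float-precision rounding on both runtimes, while casting the operands to `double` first reproduces the Unity behavior described above. The helper names here (`SumAsFloat`, `SumAsDouble`) are my own, just for illustration:

```csharp
using System;

static class FloatNormalization
{
    // Force float-precision summation: the explicit (float) cast is
    // required by the C# spec to discard any excess precision before
    // the result is widened to double.
    public static double SumAsFloat(float a, float b) => (double)(float)(a + b);

    // Force double-precision summation (the behavior Unity exhibits here):
    // both operands are widened first, then summed as doubles.
    public static double SumAsDouble(float a, float b) => (double)a + (double)b;

    static void Main()
    {
        float a = 2.34567890F, b = 1.23456782F;
        // On a conforming runtime these two prints differ in one byte,
        // matching the two outputs shown in the question.
        Console.WriteLine(BitConverter.ToString(BitConverter.GetBytes(SumAsFloat(a, b))));
        Console.WriteLine(BitConverter.ToString(BitConverter.GetBytes(SumAsDouble(a, b))));
    }
}
```

This admittedly still means touching the ambiguous lines, so it doesn't escape the "check each line" caveat; it only makes the intended precision explicit at each call site instead of leaving it to the runtime.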
Elaborated illustration
Consider this code:
using (System.IO.StreamWriter f = System.IO.File.CreateText(filename))
{
    float a = 2.34567890F;
    float b = 1.23456782F;
    double d1 = a + b; // sum, then convert
    f.WriteLine(BitConverter.ToString(BitConverter.GetBytes(d1)));
    double d2 = (double)a + (double)b; // convert, then sum
    f.WriteLine(BitConverter.ToString(BitConverter.GetBytes(d2)));
}
When I run it in the Desktop application (.NET Core 3.1, on Windows), I get two different prints:
00-00-00-**40**-58-A4-0C-40
00-00-00-**50**-58-A4-0C-40
and this is understandable, because:
- In the first line we do a float summation and then a conversion to double;
- While in the second we do two conversions to double and then a double summation.
However, in the Unity application (2019.4.11f1, on the same machine), I get the second print twice:
00-00-00-**50**-58-A4-0C-40
00-00-00-**50**-58-A4-0C-40
It seems that in both cases Unity first converts the two floats to double and then performs a double summation.
Needless to say, there are many implicit conversions in the code, and innocent expressions like Math.Sqrt(a + b)
(where a and b are floats) return different values in Unity and on Desktop, so down the road the results diverge chaotically.
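To illustrate how such call sites can be pinned down: hardware square root is typically correctly rounded per IEEE 754, so the divergence in Math.Sqrt(a + b) most likely originates in the precision of the argument, not in the sqrt itself. Forcing the argument to a definite precision with an explicit cast should therefore make the result reproducible. This is a sketch under that assumption, with a hypothetical helper name:

```csharp
using System;

static class SqrtConsistency
{
    // Explicitly round the sum to float precision before the sqrt,
    // so both runtimes feed Math.Sqrt the exact same 64-bit argument.
    public static double SqrtOfFloatSum(float a, float b) => Math.Sqrt((float)(a + b));

    static void Main()
    {
        float a = 2.34567890F, b = 1.23456782F;
        // Print the raw bit pattern so runtimes can be compared byte-for-byte.
        Console.WriteLine(BitConverter.DoubleToInt64Bits(SqrtOfFloatSum(a, b)).ToString("X16"));
    }
}
```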