I tested the casting behavior of C# in an unchecked context. As the documentation says, in an unchecked context the cast always succeeds. But in some particular cases, casting from one specific type to another gives an unexpected result.
For example, I tested three "double to sbyte" casts:
var firstCast = (sbyte) -129.83297462979882752; // Result: 127.
var secondCast = (sbyte) -65324678217.74282742874973267; // Result: 0.
var thirdCast = (sbyte) -65324678216.74282742874973267; // Result: 0.
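For reference, here is a self-contained version of what I ran. The `checked` branch at the end is my own addition for contrast, to show that the same conversion throws at run time instead of silently producing these values (note the values have to go through a variable there, since a checked out-of-range constant conversion is a compile-time error):

```csharp
using System;

class Program
{
    static void Main()
    {
        double small = -129.83297462979882752;
        double big = -65324678217.74282742874973267;

        // Unchecked: the conversion always succeeds, even for doubles far
        // outside sbyte's range (-128..127).
        Console.WriteLine(unchecked((sbyte)small)); // observed: 127
        Console.WriteLine(unchecked((sbyte)big));   // observed: 0

        // Checked: the same out-of-range conversion throws instead.
        try
        {
            sbyte b = checked((sbyte)big);
            Console.WriteLine(b);
        }
        catch (OverflowException e)
        {
            Console.WriteLine(e.GetType().Name); // OverflowException
        }
    }
}
```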
Just to be clear, the difference between the second and the third double is exactly 1 (thirdDouble - secondDouble = 1).
In this case, the result of the cast seems to always be 0 for any "big" double value. My question is: why do the second and third casts result in 0?
I searched the C# documentation for an answer, but did not find one.
I tested the above with .NET Framework 4.7.2.