How does the default Equals() work on a struct like this one:
public struct Decim
{
    decimal x;

    public Decim(decimal x)
    {
        this.x = x;
    }
}
new Decim(-0m).Equals(new Decim(0m));

returns true. Why? If it is doing a bitwise comparison, I thought decimal uses a special bit to indicate the sign.
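To show what I mean, here is a small check I put together. decimal.GetBits returns the four 32-bit words that make up a decimal, where the high bit of the last word is the sign flag and bits 16-23 hold the scale; the expected outputs in the comments are my assumption about how the compiler encodes -0m:

using System;

public struct Decim
{
    decimal x;

    public Decim(decimal x)
    {
        this.x = x;
    }
}

class Program
{
    static void Main()
    {
        // The default (ValueType) Equals still reports the two wrappers as equal.
        Console.WriteLine(new Decim(-0m).Equals(new Decim(0m)));    // True

        // Raw layout of each decimal as four 32-bit words.
        Console.WriteLine(string.Join(", ", decimal.GetBits(0m)));  // 0, 0, 0, 0
        Console.WriteLine(string.Join(", ", decimal.GetBits(-0m))); // 0, 0, 0, -2147483648
                                                                    // (my assumption: the sign flag survives for -0m)
    }
}

So the two values seem to have different bit patterns, yet Equals still says they are equal.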
Also,

new Decim(5.00m).Equals(new Decim(5.000000m));

reports true, but when I call 5.00m.ToString() and 5.000000m.ToString(), they produce different strings ("5.00" and "5.000000"). How does ToString know the difference?
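For reference, this sketch shows what I mean on plain decimal values; the comments about mantissa and scale are my reading of the decimal layout, not something I have verified against the spec:

using System;

class Program
{
    static void Main()
    {
        decimal a = 5.00m;      // presumably mantissa 500, scale 2
        decimal b = 5.000000m;  // presumably mantissa 5000000, scale 6

        // Numerically equal, so Equals returns true...
        Console.WriteLine(a.Equals(b));                           // True

        // ...but ToString preserves each value's scale.
        Console.WriteLine(a.ToString());                          // 5.00
        Console.WriteLine(b.ToString());                          // 5.000000

        // The scale sits in bits 16-23 of the last word from GetBits.
        Console.WriteLine((decimal.GetBits(a)[3] >> 16) & 0xFF);  // 2
        Console.WriteLine((decimal.GetBits(b)[3] >> 16) & 0xFF);  // 6
    }
}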