Possible Duplicate:
C# Why can equal decimals produce unequal hash values?
I've come across an issue in my .NET 3.5 application (x86 and x64, I've tried both) where two decimals that are equal but have a different number of trailing zeros produce different hash codes. For example:
decimal x = 3575.000000000000000000M;
decimal y = 3575.0000000000000000000M;
Console.WriteLine(x.GetHashCode());
Console.WriteLine(y.GetHashCode());
Console.WriteLine(x == y);
Console.WriteLine(x.GetHashCode() == y.GetHashCode());
Outputs the following on my machine:
1085009409
1085009408
True
False
I presume the difference in hash codes is down to the different internal representations of the two numbers caused by the differing scale factors.
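To check that presumption, here is a quick diagnostic sketch (not part of my application) that dumps the raw representation with Decimal.GetBits; the four 32-bit words are the 96-bit mantissa plus a flags word whose bits 16-23 hold the scale factor:

decimal x = 3575.000000000000000000M;
decimal y = 3575.0000000000000000000M;

// Decimal.GetBits returns the low, mid and high words of the 96-bit integer
// mantissa, followed by the flags word containing the scale factor.
int[] xBits = decimal.GetBits(x);
int[] yBits = decimal.GetBits(y);

Console.WriteLine(string.Join(", ", Array.ConvertAll(xBits, b => b.ToString("X8"))));
Console.WriteLine(string.Join(", ", Array.ConvertAll(yBits, b => b.ToString("X8"))));

// The two values compare equal, but their mantissas and scale factors differ:
// x is stored as 3575 * 10^18 with scale 18, y as 3575 * 10^19 with scale 19.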
Whilst I can work around the issue by removing the trailing zeros (sketch below), I had always assumed that GetHashCode should return the same value for x and y if x == y. Is this assumption wrong, or is this a problem with Decimal.GetHashCode?
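For reference, the workaround I mean is along these lines; the divide-by-one trick is just one way to strip the trailing zeros before hashing, not necessarily the best:

static decimal Normalize(decimal value)
{
    // Dividing by 1 written at full scale makes the runtime re-derive the
    // smallest scale that still represents the value exactly, which removes
    // the trailing zeros from the stored representation.
    return value / 1.0000000000000000000000000000M;
}

// Normalize(x).GetHashCode() == Normalize(y).GetHashCode() is then true.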
EDIT: To be clear on versions: I'm using Visual Studio 2008 SP1 and .NET 3.5.