In CLR via C#, 4th edition (Microsoft Press), Jeffrey Richter states at one point that while Object.Equals
currently checks for identity (reference) equality, Microsoft should have implemented the method like this:
public class Object {
    public virtual Boolean Equals(Object obj) {
        // The given object to compare to can't be null
        if (obj == null) return false;

        // If the objects are different types, they can't be equal
        if (this.GetType() != obj.GetType()) return false;

        // If the objects are the same type, return true if all of their fields match
        // Because System.Object defines no fields, the fields match
        return true;
    }
}
This strikes me as very odd: by default, every non-null object of the same type would be equal? So unless Equals is overridden, all instances of a type are equal (e.g. all your locking objects compare equal) and return the same hash code. And assuming that operator == on Object still checks reference equality, this would mean that (a == b) != a.Equals(b), which would also be strange.
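To make the consequence concrete, here is a small sketch of Richter's proposed default, written in Java rather than C# only so it is easy to run (Java's Object.equals likewise defaults to reference equality, so the comparison is faithful). The class name RichterObject is made up for illustration:

```java
// A sketch of the default Richter proposes: any two non-null instances of
// the same type compare equal, because the type defines no fields to differ.
class RichterObject {
    @Override
    public boolean equals(Object obj) {
        if (obj == null) return false;                  // null is never equal
        if (getClass() != obj.getClass()) return false; // different types can't be equal
        return true;                                    // no fields, so "all fields match"
    }

    @Override
    public int hashCode() {
        return getClass().hashCode();                   // kept consistent with equals
    }

    public static void main(String[] args) {
        RichterObject a = new RichterObject();
        RichterObject b = new RichterObject();
        System.out.println(a.equals(b)); // true: two distinct instances compare equal
        System.out.println(a == b);      // false: == still checks reference identity
    }
}
```

Running this shows exactly the oddity above: a.equals(b) is true while a == b is false, so the two notions of equality disagree for every pair of distinct instances.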
I think treating objects as equal only when they are the exact same instance (identity) is a better default than making everything equal unless overridden. But this is the 4th edition of a well-known book published by Microsoft Press, so there must be some merit to the idea. I read the rest of the text, but I can't help wondering: why does the author suggest this? What am I missing here? What is the great advantage of Richter's implementation over the current Object.Equals implementation?