I'm sorting a list of objects. The objects have many fields, but only one of them matters for the ordering. So I wrote a comparator:
Collections.sort(details, new Comparator<MyObj>() {
    @Override
    public int compare(MyObj d1, MyObj d2) {
        if (d1.getDate() == null && d2.getDate() == null) {
            return 0;
        } else if (d1.getDate() == null) {
            return -1;
        } else if (d2.getDate() == null) {
            return 1;
        }
        if (d1.getDate().before(d2.getDate())) return 1;
        else if (d1.getDate().after(d2.getDate())) return -1;
        else return 0;
    }
});
From the perspective of my use case, this Comparator does all it needs to, even though it makes the sort non-deterministic. Still, I wonder if this is bad code: through this Comparator, two very distinct objects can compare as equal in ordering even though they are unequal objects. So I decided to use hashCode as a tiebreaker, and it came out something like this:
Collections.sort(details, new Comparator<MyObj>() {
    @Override
    public int compare(MyObj d1, MyObj d2) {
        if (d1.getDate() == null && d2.getDate() == null) {
            // Both dates missing: fall back to hashCode as the tiebreaker.
            return d1.hashCode() - d2.hashCode();
        } else if (d1.getDate() == null) {
            return -1;
        } else if (d2.getDate() == null) {
            return 1;
        }
        if (d1.getDate().before(d2.getDate())) return 1;
        else if (d1.getDate().after(d2.getDate())) return -1;
        // Dates compare as equal: break the tie with hashCode.
        else return d1.hashCode() - d2.hashCode();
    }
});
(What I return might be backwards, but that is not important to this question.)
Is this necessary?
EDIT: To anyone else looking at this question, consider using Guava's Ordering API. The logic above was replaced by:
return Ordering.<Date>natural().reverse().nullsLast().compare(d1.getDate(), d2.getDate());
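For context, here is a minimal sketch of how that one-liner can be pulled out into a standalone ordering, assuming Guava is on the classpath and MyObj exposes getDate() as in the snippets above (the onResultOf extraction is just one possible way to write it, not necessarily what I ended up with):

import java.util.Collections;
import java.util.Date;

import com.google.common.base.Function;
import com.google.common.collect.Ordering;

// Sketch only: MyObj and the details list are assumed to match the snippets above.
Ordering<MyObj> byDateDescNullsLast = Ordering.<Date>natural()
        .reverse()      // newest dates first
        .nullsLast()    // null dates sort to the end
        .onResultOf(new Function<MyObj, Date>() {
            @Override
            public Date apply(MyObj obj) {
                return obj.getDate();
            }
        });

Collections.sort(details, byDateDescNullsLast);

Note that nullsLast() pushes null dates to the end, whereas my original comparator sorted them to the front; swap in nullsFirst() to keep that behavior.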