Jens's solution will work, but it has a rather slow running time of O(n²).
A better alternative if you have a long list is to
- Sort the list
- Compare every item with its successor.
This takes O(n log n) for the quicksort plus O(n) for the adjacent-compare pass, for a total running time of O(n log n).
See the following pseudo code (I don't have access to Delphi right now).
function RemoveDuplicates(const Input: IEnumerable<Integer>): IEnumerable<Integer>;
var
  List: IList<Integer>;
  i: Integer;
begin
  List := TCollections.CreateList<Integer>;
  List.AddRange(Input); //Copy the input into a private list we are free to reorder.
  List.Sort;
  //Walk backwards so deleting an item does not shift the indexes still to be visited.
  for i := List.Count-1 downto 1 do begin
    if List[i] = List[i-1] then List.Delete(i);
    //for a generic T: if TEqualityComparer<T>.Default.Equals(List[i], List[i-1]) then ...
  end; {for i}
  Result := List;
end;
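For illustration only, a call site might look like the sketch below. It assumes the Spring4D collections used above (uses Spring.Collections) and the TCollections.CreateList overload that takes an open array; the procedure name is made up.

procedure Demo;
var
  Input, Output: IEnumerable<Integer>;
  Value: Integer;
begin
  Input := TCollections.CreateList<Integer>([3, 1, 3, 2, 1]);
  Output := RemoveDuplicates(Input);
  for Value in Output do
    Writeln(Value); //Prints 1, 2, 3 - duplicates removed, but in sorted order.
end;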
Problems
The problem with this approach is that the output may come back in a different order from the input (sorted rather than in the original order). Whether that matters depends on your use case.
Benefits (or why the dictionary sucks)
If the sorting is a cheap operation this will be the fastest approach.
Using a dictionary carries a high constant cost because of the hashing.
Even though a hash lookup is O(1), hashing can get very expensive for large keys, because the hash function always processes the whole key, whereas a sorting comparison stops as soon as a difference is found.
Further note that hashing is a much more expensive operation than a simple comparison (about 30x to 100x slower)!
Only when the list is huge does the better asymptotic running time of the dictionary kick in.
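For comparison, here is a minimal sketch of the hash-based approach discussed above, assuming Spring4D's TCollections.CreateSet (the function name is made up); as a side effect it preserves the input order.

function RemoveDuplicatesKeepOrder(const Input: IEnumerable<Integer>): IEnumerable<Integer>;
var
  Seen: ISet<Integer>;
  List: IList<Integer>;
  Value: Integer;
begin
  Seen := TCollections.CreateSet<Integer>; //Hash set: every lookup/insert hashes the key.
  List := TCollections.CreateList<Integer>;
  for Value in Input do
    if not Seen.Contains(Value) then begin
      Seen.Add(Value);
      List.Add(Value); //Keep the first occurrence, in input order.
    end;
  Result := List;
end;

On paper this is O(n), but every element pays the hashing cost described above, which is why the sort-based version often wins for small and medium-sized lists.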