Possible Duplicate:
Python: removing duplicates from a list of lists
Say I have a list:
a=[1,2,1,2,1,3]
If all the elements of a are hashable (as they are in this case), this would do the job:
list(set(a))
But what if
a=[[1,2],[1,2],[1,3]]
?
Python 2
>>> from itertools import groupby
>>> a = [[1,2],[1,2],[1,3]]
>>> [k for k,v in groupby(sorted(a))]
[[1, 2], [1, 3]]
This also works in Python 3, with the caveat that all elements must be mutually orderable (otherwise sorted raises a TypeError).
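As a self-contained sketch of the same idea (the function name is mine, not from the answer): sorting places equal sublists next to each other, so groupby collapses each run of duplicates to one key.

```python
from itertools import groupby

def dedupe_sorted(items):
    # sorted() puts equal elements adjacent, so groupby yields
    # one key per run of duplicates.
    return [key for key, _group in groupby(sorted(items))]

a = [[1, 2], [1, 2], [1, 3]]
print(dedupe_sorted(a))  # [[1, 2], [1, 3]]
```

Note that the result comes back in sorted order, not the original order of a.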
This set comprehension turns the list of lists into a set of tuples:
>>> {tuple(e) for e in a}
set([(1, 2), (1, 3)])
Then use that to turn it into a list of lists again with no duplicates:
>>> [list(x) for x in {tuple(e) for e in a}]
[[1, 2], [1, 3]]
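Both answers discard the original order. If order matters, a common alternative (a sketch, not taken from either answer) tracks which sublists have already been seen, using tuples as the hashable keys:

```python
def dedupe_keep_order(list_of_lists):
    seen = set()           # tuples of sublists already emitted
    result = []
    for sub in list_of_lists:
        key = tuple(sub)   # lists are unhashable; tuples are not
        if key not in seen:
            seen.add(key)
            result.append(sub)
    return result

a = [[1, 2], [1, 2], [1, 3]]
print(dedupe_keep_order(a))  # [[1, 2], [1, 3]]
```

This keeps the first occurrence of each sublist and runs in O(n) expected time.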