
Let's say I have a list of nested lists:

[["a","b","c"], ["d","e","f"], ["g","h","i","j"]...]

What is the best way to convert it to a single flat list like this?

["a", "b", "c", "d", "e", ...]
user1040563
  • While this is probably a duplicate of _something_, it's not a duplicate of the linked question, which is about creating a list like `[["a", "d", "g"], ["a", "d", "h"], ["a", "d", "i"], ...]` which is not at all what is wanted here. – agf Dec 22 '11 at 20:44
  • @agf replaced possible duplicates with actual duplicates – dbr Jan 15 '12 at 11:58

3 Answers


Use itertools.chain:

from itertools import chain

list(chain.from_iterable(list_of_lists))
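For example, with the sample data from the question (the variable name `list_of_lists` is just a placeholder):

>>> from itertools import chain
>>> list_of_lists = [["a", "b", "c"], ["d", "e", "f"], ["g", "h", "i", "j"]]
>>> list(chain.from_iterable(list_of_lists))
['a', 'b', 'c', 'd', 'e', 'f', 'g', 'h', 'i', 'j']

Unlike `chain(*list_of_lists)`, `chain.from_iterable` does not unpack the outer sequence up front, so it also works when the outer sequence is itself a lazy iterator.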
agf
  • You could explain more. – GLHF May 15 '16 at 10:52
  • @GLHF It's a straightforward question (as well as being over four years old) and doesn't really need more explanation than is in the docs I linked to and the example I gave. – agf May 15 '16 at 23:44

There's a straightforward example of this in the itertools documentation (see the flatten() recipe at http://docs.python.org/library/itertools.html#recipes), but it's as simple as:

>>> from itertools import chain
>>> x = [["a","b","c"], ["d","e","f"], ["g","h","i","j"]]
>>> list(chain(*x))
['a', 'b', 'c', 'd', 'e', 'f', 'g', 'h', 'i', 'j']

Or, it can be done very easily in a single list comprehension:

>>> x=[["a","b","c"], ["d","e","f"], ["g","h","i","j"]]
>>> [j for i in x for j in i]
['a', 'b', 'c', 'd', 'e', 'f', 'g', 'h', 'i', 'j']

Or via reduce():

>>> from operator import add
>>> reduce(add, x)
['a', 'b', 'c', 'd', 'e', 'f', 'g', 'h', 'i', 'j']
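
Note that on Python 3, `reduce` is no longer a builtin, so the same approach needs one extra import (everything else is unchanged):

>>> from functools import reduce
>>> from operator import add
>>> reduce(add, x)
['a', 'b', 'c', 'd', 'e', 'f', 'g', 'h', 'i', 'j']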
Austin Marshall

An alternative solution to using itertools.chain would be:

>>> li = [["a","b","c"], ["d","e","f"], ["g","h","i","j"]]
>>> chained = []
>>> while li:
...     chained.extend(li.pop(0))
... 
>>> chained
['a', 'b', 'c', 'd', 'e', 'f', 'g', 'h', 'i', 'j']

EDIT: The above example consumes the original list while building the new one, so it may be an advantage if you are manipulating very large lists and want to minimise memory usage. If that is not a concern, I would consider using itertools.chain the more Pythonic way to achieve the result.
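
As a side note, `pop(0)` on a plain list is O(n), so the loop above is quadratic in the number of sublists (see the comments below). A rough sketch of the same consume-as-you-go idea with O(1) pops, assuming it is acceptable to keep the sublists in a `collections.deque` rather than a plain list:

>>> from collections import deque
>>> li = deque([["a","b","c"], ["d","e","f"], ["g","h","i","j"]])
>>> chained = []
>>> while li:
...     chained.extend(li.popleft())  # popleft() is O(1) on a deque
... 
>>> chained
['a', 'b', 'c', 'd', 'e', 'f', 'g', 'h', 'i', 'j']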

mac
  • Thank you. I tried using `list(itertools.chain(*a))` but this message always comes up: `TypeError: object of type 'NoneType' has no len()`. When I tried it on a simple list like the one above it worked fine, but when I tried it on a longer, more complex one it showed me this error. Maybe it has something to do with the fact that there are many empty nested lists (`[]`)? – user1040563 Nov 30 '11 at 15:02
  • @user1040563 What have you tried exactly? To me both solutions (mine and agf's) work fine even with empty sublists... – mac Nov 30 '11 at 15:10
  • This has higher time complexity than `chain` or similar solutions because `pop(0)` is O(n) -- see the "delete item" entry of the "list" section of the [Python Wiki Time Complexity](http://wiki.python.org/moin/TimeComplexity) page. If you wanted to use `extend` and have the whole thing be linear time, it would be just `for sublist in li: chained.extend(sublist)` -- there is no way to do it in linear time from a `list` in Python without extra storage space (which I assume is what you were trying to avoid). – agf Nov 30 '11 at 15:28
  • @agf - Indeed I thought that consuming the original list while building the new one was the "added value" of this solution... Thank you for the link to the time complexity page. I wasn't aware of its existence. Interesting! :) – mac Nov 30 '11 at 15:36
  • I disagree that this version is good for very large lists. It's definitely bad if the sublists are short but there are lots of them, because of the O(total length * number of sublists) complexity. It's possibly good because of the memory savings if the sublists are long but there aren't very many of them, but keep in mind that the memory usage of a second list is just for the *references* to the objects in the sublists, no copy of the objects is made, so memory would have to be *really* tight. – agf Nov 30 '11 at 18:00
  • @user1040563 using `res = reduce(list.__add__, listofitems)` is safe regarding the error you had. – mh-firouzjah Sep 28 '21 at 12:41