I have several large lists. I would like to combine them into a one-dimensional list. For example,
small_lists = [[{'value1':1}]*100000,[{'value2':2}]*100000,[{'value3':3}]*100000]
combined_list = []
for small_list in small_lists:
    combined_list.extend(small_list)
Is there a faster way than above?
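One alternative I have seen mentioned (just a sketch, not something I have benchmarked at scale) is itertools.chain.from_iterable, which walks the sublists lazily and lets list() build the result in one pass:

```python
from itertools import chain

# Hypothetical small inputs mirroring the example above
small_lists = [[{'value1': 1}] * 100000,
               [{'value2': 2}] * 100000,
               [{'value3': 3}] * 100000]

# chain.from_iterable yields the elements of each sublist in turn;
# list() materializes them into a single flat list
combined_list = list(chain.from_iterable(small_lists))
```

Would something like this be expected to beat the extend loop?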
numpy is suggested in several answers, but it seems significantly slower for me. Am I doing anything wrong?
import time
import numpy as np
small_lists = [[{'value1':1}]*10000000,[{'value2':2}]*10000000,[{'value3':3}]*10000000]
start = time.time()
np_list = np.array(small_lists).flatten()
print("{} sec".format(time.time() - start))
print(len(np_list))
start = time.time()
combined_list = []
for small_list in small_lists:
    combined_list.extend(small_list)
print("{} sec".format(time.time() - start))
print(len(combined_list))
from functools import reduce
start = time.time()
reduce_list = reduce(lambda x, y: x+y, small_lists)
print("{} sec".format(time.time() - start))
print(len(reduce_list))
The output is 2.01335906982 sec for numpy, 0.113998889923 sec for extend, and 0.299326896667 sec for reduce. extend is by far the fastest.
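My guess (a quick check of my own, not part of the benchmark above) is that numpy cannot vectorize dicts at all: np.array falls back to a dtype=object array, which just stores pointers to the same Python objects, so the conversion costs time and buys nothing:

```python
import numpy as np

# Small stand-in for the benchmark data
small_lists = [[{'value1': 1}] * 3, [{'value2': 2}] * 3]

# Dicts are not numeric, so numpy builds an object array:
# a 2x3 grid of pointers to Python dicts, with no vectorized operations
arr = np.array(small_lists)
print(arr.dtype)   # object
print(arr.shape)   # (2, 3)
```

Is that the right explanation, or is there a way to use numpy effectively here?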