
Handling a list product of size 100 x 100 is fine in Python:

>>> import itertools
>>> import numpy as np
>>> nested_loop_iter = itertools.product(range(100), range(100))
>>> probs = np.fromiter(map(lambda x: x[0] * x[1], nested_loop_iter), dtype=int)
>>> probs
array([   0,    0,    0, ..., 9603, 9702, 9801])

But when the size of the list product grows to 100,000 x 100,000, it throws an IndexError:

>>> import itertools
>>> import numpy as np
>>> nested_loop_iter = itertools.product(range(100000), range(100000))
>>> probs = np.fromiter(map(lambda x: x[0] * x[1], nested_loop_iter), dtype=int)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
IndexError: list assignment index out of range

Can Python handle such a huge list product? That comes to 10,000,000,000 elements in the resulting array.
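
For scale, here is a back-of-the-envelope estimate of the memory such an array would need (a sketch, assuming the default 64-bit integer dtype):

>>> import numpy as np
>>> n_items = 100000 * 100000                           # number of products
>>> n_items
10000000000
>>> n_items * np.dtype(np.int64).itemsize / 1024**3     # GiB needed as int64
74.50580596923828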

According to How Big can a Python Array Get?, Python should be able to handle a list of 10,000,000,000 elements, so why does this still throw an IndexError?
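
If memory itself is not the limit, one variant I could test is giving np.fromiter an explicit count, so it pre-allocates the whole output array up front instead of growing an internal buffer (an untested sketch; it would still need roughly 75 GiB of free RAM):

>>> import itertools
>>> import numpy as np
>>> nested_loop_iter = itertools.product(range(100000), range(100000))
>>> # count=... lets fromiter allocate all 10,000,000,000 slots at once
>>> probs = np.fromiter(map(lambda x: x[0] * x[1], nested_loop_iter),
...                     dtype=np.int64, count=100000 * 100000)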

  • My calculation also gives 10,000,000,000 items, which for integers would require about 75 GiB of memory. Maybe the error is a side effect of an underlying memory error. – Klaus D. Nov 21 '16 at 07:14
  • I have 250 GB of RAM on my machine, but I'm not sure whether Python can access all of it. – alvas Nov 21 '16 at 07:17
  • I tried 10,000 * 100,000 and it works fine; let me see whether I can save 10 of those to reach 10,000,000,000. – alvas Nov 21 '16 at 07:19
  • Possibly out of RAM? The probs array needs to be built at runtime. – Skycc Nov 21 '16 at 10:44

0 Answers