I came across two different ways to split an iterable into "chunks" of more than one item at a time.
    from itertools import izip_longest

    def grouper(iterable, n, fillvalue=None):
        # n references to the *same* iterator, so izip_longest pulls n items at a time
        args = [iter(iterable)] * n
        return izip_longest(*args, fillvalue=fillvalue)
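For concreteness, this is what it produces on a short input (the example values are just mine; note that the last group is padded with fillvalue):

    >>> list(grouper('ABCDEFG', 3, fillvalue='x'))
    [('A', 'B', 'C'), ('D', 'E', 'F'), ('G', 'x', 'x')]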
The other method is plain Python slicing:
    def chunker(seq, size):
        # step through the sequence `size` items at a time and yield each slice
        return (seq[pos:pos + size] for pos in xrange(0, len(seq), size))
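Used the same way, this one gives a shorter last chunk instead of padding (again, just my test values):

    >>> list(chunker('ABCDEFG', 3))
    ['ABC', 'DEF', 'G']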
Does the itertools implementation buy you anything "extra", e.g. is it faster, more flexible, or safer? I ask because the itertools version is definitely not more readable or intuitive, in my opinion.
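One difference I did notice while testing (assuming I'm calling both correctly): grouper accepts any iterable, while chunker needs something that supports len() and slicing, so it fails on a generator:

    >>> gen = (c for c in 'ABCDEFG')   # a generator: no len(), no slicing
    >>> list(grouper(gen, 3))
    [('A', 'B', 'C'), ('D', 'E', 'F'), ('G', None, None)]
    >>> chunker(gen, 3)
    TypeError: object of type 'generator' has no len()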