Possible Duplicate: Is there a generator version of string.split() in Python?
str.split(delim) splits a string into a list of tokens separated by delim. The entire list of tokens is returned in one hit.
When dealing with large blocks of text, it might be advantageous to process tokens lazily, i.e. to get one token at a time, as needed, rather than materialising the whole list up front. (The example that springs to mind is processing a large chunk of text held in memory.)
Is there a builtin or a standard-library function that will perform a lazy split()? Something from itertools?
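For reference, the lazy behaviour being asked about can be sketched with a plain generator built on str.find. This is a minimal illustration, not a standard-library function, and the name lazy_split is made up for the example:

```python
def lazy_split(text, delim):
    """Yield the tokens of text.split(delim) one at a time.

    Tokens are produced on demand, so no list of all tokens is
    ever built; only the current slice is materialised.
    """
    start = 0
    while True:
        end = text.find(delim, start)
        if end == -1:
            # No more delimiters: the remainder is the final token.
            yield text[start:]
            return
        yield text[start:end]
        start = end + len(delim)
```

Consuming it with next() (or in a for loop) pulls one token at a time, while list(lazy_split(s, d)) should match s.split(d), including empty tokens between adjacent delimiters.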