Possible Duplicate:
Is there a generator version of string.split() in Python?

str.split(delim) splits a string into a list of tokens separated by delim. The entire list of tokens is built and returned in one go.
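For example:

```python
# str.split() scans the whole string and returns the full token list at once.
tokens = "one two three".split(" ")
# tokens == ['one', 'two', 'three']
```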

When dealing with large blocks of text, it might be advantageous to process tokens lazily, i.e. produce one token at a time, only as needed. (The example that springs to mind is tokenizing a large chunk of text already held in memory.)

Is there a builtin or a standard library function that will perform a lazy split()? Something from itertools?

Li-aung Yip
  • I have just posted a new answer in the duplicate which was not there before, since I think you can use `re.finditer()`, which would not consume any extra memory: http://stackoverflow.com/a/9770397/711085 – ninjagecko Mar 19 '12 at 12:44

1 Answer

Not an exact equivalent, but re.finditer() searches the string lazily.
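A minimal sketch of that idea. The `lazy_split` name is ours, not from the answer, and note one way it is *not* an exact equivalent: unlike str.split(delim), it skips the empty tokens produced by consecutive delimiters.

```python
import re

def lazy_split(text, delim):
    # Hypothetical helper: yield tokens one at a time instead of
    # building the whole list, by lazily matching runs of
    # non-delimiter characters with re.finditer().
    # Caveat: consecutive delimiters yield no empty token here,
    # whereas 'a,,b'.split(',') returns ['a', '', 'b'].
    pattern = re.compile('[^' + re.escape(delim) + ']+')
    for match in pattern.finditer(text):
        yield match.group(0)

tokens = lazy_split('one,two,,three', ',')  # a generator; nothing scanned yet
first = next(tokens)  # 'one' -- only as much of the string as needed
```

Each call to next() advances finditer() just far enough to find the next match, so memory use stays proportional to one token rather than the whole token list.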

hamstergene