If you really need to scan one giant (e.g. 1 TB) single-line file and process items separated by a delimiter, you can read the file in fixed-size blocks, split each block, and handle the items that straddle block boundaries. (This assumes a single-character delimiter; a longer delimiter could itself be split across two blocks.) Here is a generator that does this:
    def split_file(file, delim, block_size=1024 * 1024):
        last_item = ''
        while True:
            block = file.read(block_size)
            if not block:
                break
            items = block.split(delim)
            # Every item except the last is complete; the last one
            # may continue in the next block, so carry it over.
            for item in items[:-1]:
                if last_item:
                    # Join the fragment carried over from the previous block.
                    yield last_item + item
                    last_item = ''
                elif item:
                    yield item
            last_item += items[-1]
        # Emit the trailing item left over after the final block.
        if last_item:
            yield last_item
You can simply use it like this:
    with open("names.in.txt") as f:
        for name in split_file(f, ","):
            print(name)  # process one item here
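To see the boundary handling in action, here is a quick self-contained check using io.StringIO in place of a real file and a deliberately tiny block_size, so that every name straddles a block boundary (the generator is repeated so the snippet runs on its own):

    import io

    def split_file(file, delim, block_size=1024 * 1024):
        last_item = ''
        while True:
            block = file.read(block_size)
            if not block:
                break
            items = block.split(delim)
            # Every item except the last is complete; carry the last over.
            for item in items[:-1]:
                if last_item:
                    yield last_item + item
                    last_item = ''
                elif item:
                    yield item
            last_item += items[-1]
        if last_item:
            yield last_item

    data = io.StringIO("alice,bob,carol,dave")
    print(list(split_file(data, ",", block_size=3)))
    # With block_size=3 the reads are "ali", "ce,", "bob", ",ca", ...
    # yet the output is ['alice', 'bob', 'carol', 'dave']

The block size only affects how much memory is used per read, never where the items are cut, which is exactly the property you want when the real file is 1 TB.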