In Python 2.6, is there a more efficient way to search a file line by line for a string and, after finding it, insert some lines into that file? The output file would just be the same as the input file with a few lines added in between. Also, I'd rather not read these files into a buffer because the files can be very large.
Right now, I'm reading the file line by line and writing it into a temp file until I find the line I'm looking for, then inserting the extra data into the temp file and writing the rest of the data after it. Once I'm done processing the file, I overwrite the old file with the new temp file. Something like this:
with open(file_in_read, 'r') as inFile:
    if os.path.exists(file_in_write):
        os.remove(file_in_write)
    with open(file_in_write, 'a') as outFile:
        for line in inFile:
            if re.search(r'<search_string>', line):
                write_some_data(outFile)
                outFile.write(line)
            else:
                outFile.write(line)
os.rename(file_in_write, file_in_read)
I was just wondering if I can speed it up somehow.
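For example, one variant I've considered (just a sketch, and the function name, pattern, and extra lines are placeholders): once the match is found, stop testing each line and bulk-copy the remainder with shutil.copyfileobj, so only the lines up to the match are processed individually. The nested with statements are for 2.6 compatibility.

```python
import re
import shutil

def insert_before_match(file_in_read, file_in_write, pattern, extra_lines):
    # Copy file_in_read to file_in_write, inserting extra_lines just
    # before the first line that matches pattern. After the match,
    # bulk-copy the rest of the input instead of looping line by line.
    with open(file_in_read, 'r') as in_file:
        with open(file_in_write, 'w') as out_file:  # nested 'with' for Python 2.6
            for line in in_file:
                if re.search(pattern, line):
                    out_file.writelines(extra_lines)
                    out_file.write(line)
                    shutil.copyfileobj(in_file, out_file)
                    break
                out_file.write(line)
```

Not sure whether skipping the regex test for the tail of the file buys much in practice, though, since the copy is still I/O-bound.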