
I want to separate a large file with millions of lines into files with 1000 lines each.

I'm thinking of iterating over each line of the large file and writing lines into a new file until it reaches 1000 lines, then creating another file and filling it to 1000 lines, and so on. I'm a newbie and I'm having a hard time putting this into Python code.

Thanks in advance for the help.

Just some additional info: I'm using Python 2.7.5 on Windows 7.
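The loop described above can be sketched roughly as follows. This is a minimal sketch, not a definitive implementation: the file names (`big_file.txt`, the `small_file_` prefix) and the helper name `split_file` are illustrative, and the code is written to run on both Python 2.7 and Python 3.

```python
def split_file(path, lines_per_file=1000, prefix='small_file_'):
    """Split the file at `path` into chunks of `lines_per_file` lines.

    Streams the input line by line, so the whole file is never held
    in memory. Returns the list of chunk file names written.
    """
    out = None
    written = []
    with open(path) as big:
        for i, line in enumerate(big):
            if i % lines_per_file == 0:       # time to start a new chunk
                if out is not None:
                    out.close()
                name = '{}{}.txt'.format(prefix, i // lines_per_file)
                out = open(name, 'w')
                written.append(name)
            out.write(line)
    if out is not None:                        # close the last chunk
        out.close()
    return written

# Example (file names are placeholders):
# split_file('big_file.txt')  -> writes small_file_0.txt, small_file_1.txt, ...
```

The key idea is that `i % lines_per_file == 0` is true exactly on lines 0, 1000, 2000, ..., so a fresh output file is opened at each of those points; the final file simply ends up shorter if the line count isn't an exact multiple of 1000.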

  • Iterate over the input file wrapped in the grouper from the dupe question, open a new outputfile for each chunk generated. – Martijn Pieters May 29 '14 at 17:01
  • `for i, group in enumerate(grouper(openinputfile)):`, `with open('outputfilename_{}'.format(i), 'w') as outfh:`, `for line in group: outfh.write(line)` should do it. – Martijn Pieters May 29 '14 at 17:04
  • Note that if you're working on a linux system, you can use the `split` command to do this (`split --lines 1000 bigfile.txt newfileprefix.` or something.) – DSM May 29 '14 at 17:06
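The grouping approach suggested in the comments can be sketched with `itertools.islice`, used here as a stand-in for the `grouper` recipe the comments refer to (the helper names `read_chunks` and `split_file_grouped` are my own, not from the comments):

```python
from itertools import islice

def read_chunks(iterable, size):
    """Yield successive lists of up to `size` items from `iterable`."""
    it = iter(iterable)
    while True:
        chunk = list(islice(it, size))
        if not chunk:
            break
        yield chunk

def split_file_grouped(path, lines_per_file=1000, prefix='part_'):
    """Write each group of `lines_per_file` lines to its own file."""
    written = []
    with open(path) as big:
        for i, group in enumerate(read_chunks(big, lines_per_file)):
            name = '{}{}.txt'.format(prefix, i)
            with open(name, 'w') as outfh:
                outfh.writelines(group)
            written.append(name)
    return written
```

Compared with the plain counting loop, this version lets the `with` block close each output file for you; the trade-off is that one chunk (up to 1000 lines) is held in memory at a time, which is harmless at this chunk size.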
