
I am working with text files and looping over them in Python. It works well with files of 10k to 20k lines, which is the length of most of them, but a few files are over 100k lines, and on those the code just stops or keeps buffering. How can I improve the speed, or open the text file directly? Even if some iteration is needed, it should be quick. I also want the file's contents as a single string, so no readlines.

Manas Jadhav
    Why did you tag this XML, thereby bringing the post to the attention of thousands of people who aren't able to help you? – Michael Kay Nov 16 '22 at 09:20

1 Answer


I'm confused as to your "without iteration" requirement. You're already looping over the files with a plain open or some other method, so what is it that you want to change? Since you didn't post your code at all, there's nothing to work from to understand what might be happening or to suggest changes to.

Also, it sounds like you're hitting system limits rather than a limit of Python itself. For a question like this it would be worthwhile to give your system specs alongside the code so that someone responding can get the full picture.

Typically I just do something similar to the good ol' standby that won't destroy your memory:

fhand = open('file.extension')
for line in fhand:
    # do the thing you need to do with each line

You can see a more detailed explanation here or in Dr Chuck's handy free textbook under the "Files" section.
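
If you really do want the whole file as one string rather than iterating line by line, a single read() (or pathlib's read_text()) does that in one call; a 100k-line text file is usually only a few megabytes, so this is normally fast. A minimal sketch, where 'large.txt' and the UTF-8 encoding are just assumptions for the example:

from pathlib import Path

# Read the entire file into one string in a single call.
text = Path('large.txt').read_text(encoding='utf-8')

# Equivalent with plain open(); the with-block closes the file for you.
with open('large.txt', encoding='utf-8') as fhand:
    text = fhand.read()

If the script still stalls on the big files with this, the slow part is more likely whatever you do with the string afterwards than the read itself.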

Naeblis