I have a 600+ MB file on disk that is continuously appended to by multiple agents.

The code below opens and reads this large file:

def alert_entries():
    with open('large_size_file') as f:
        # ... (setup elided in the original question)
        for entry in entry_re.findall(f.read()):
            yield entry

The machine slows down noticeably after this file is opened.


How can I open and read a file this large in a memory-efficient way?

overexchange

1 Answer


Calling f.read() reads the whole file into memory. You can either iterate over the file directly with for line in f (as mentioned in the comments), or pass a size argument, e.g. read(size), to limit how much you process in one go.
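
For instance, a minimal line-by-line sketch might look like this. The pattern here is a placeholder (the question never shows the actual entry_re), and it assumes each entry fits on a single line:

import re

# Placeholder pattern; the question's real entry_re is not shown.
entry_re = re.compile(r'ALERT:.*')

def alert_entries(path='large_size_file'):
    with open(path) as f:
        for line in f:  # the file object yields one line at a time
            for entry in entry_re.findall(line):
                yield entry

This keeps only one line in memory at a time instead of the entire 600+ MB buffer that f.read() builds.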

Refer to this post for an example using the latter approach.
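
A chunked read(size) version could look like the sketch below. It still assumes each entry ends at a newline; the chunk size and the carry-over of the trailing partial line are my assumptions, not something taken from the linked post:

import re

entry_re = re.compile(r'ALERT:.*')  # placeholder pattern, as above

def alert_entries_chunked(path='large_size_file', chunk_size=1 << 20):
    pending = ''
    with open(path) as f:
        while True:
            chunk = f.read(chunk_size)  # at most ~1 MB read per iteration
            if not chunk:
                break
            pending += chunk
            # Hold back the last partial line so a match straddling
            # two chunks is not cut in half.
            complete, _, pending = pending.rpartition('\n')
            for entry in entry_re.findall(complete):
                yield entry
    # Whatever remains after the final newline.
    for entry in entry_re.findall(pending):
        yield entry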

Solaxun