
I'm trying to read a file in smallish chunks with Python to work around having less than 1 GB of memory to play with. I'm able to write the file to disk and read it back in chunks, but no matter what I try I always end up with a MemoryError. I originally didn't have the del/gc calls, but added them after reading a bit online.

Can anyone point me in the right direction for reading this file in chunks (256 MB to 512 MB) and releasing each chunk from memory as soon as it's processed, before loading the next one?

import gc
import os

with open(path) as in_file:
    current = 0
    total = os.stat(path).st_size
    while current < total:
        in_file.seek(current, 0)
        bytes_read = in_file.read(byte_count)
        # do other things with the bytes here
        in_file.close()
        del in_file
        gc.collect()
        current += byte_count
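
In case it helps, this is what I think the loop should look like without the close/del/gc calls, though I'm not sure it actually frees each chunk before the next read. A rough sketch, reusing `path` from above; the 256 MiB chunk size is just a value I picked:

byte_count = 256 * 1024 * 1024  # assumed chunk size: 256 MiB

with open(path, 'rb') as in_file:  # binary mode so read() returns raw bytes
    while True:
        bytes_read = in_file.read(byte_count)  # read() advances the position, so no seek() needed
        if not bytes_read:  # empty result means end of file
            break
        # do other things with the bytes here
        # rebinding bytes_read on the next iteration drops the only
        # reference to the previous chunk, so Python can free it

My understanding is that the with block closes the file once at the end, and each read() replaces the previous chunk, so the explicit del/gc shouldn't be needed.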
It's difficult for anyone to help since most folks have much more memory than that and it's unclear how to measure memory usage. That said, see the answers to the question [Read file in chunks - RAM-usage, reading strings from binary files](https://stackoverflow.com/questions/17056382/read-file-in-chunks-ram-usage-reading-strings-from-binary-files). – martineau Nov 02 '21 at 00:48
