I have 1.6 GB available to a Python process. I'm writing a large CSV file whose data comes from a database. The problem: after the file is written, the memory (>1.5 GB) is not released immediately, which causes an error in the next bit of code (a memory allocation fails because the OS cannot find enough free memory).
Is there a function that would help me release that memory? Or do you have a better way to do this?
This is the script I'm using to write the file; it writes in chunks to deal with the memory issue:
import csv

size_to_read = 20000

# Fetch the first chunk of rows from the database cursor.
sqlData = rs_cursor.fetchmany(size_to_read)

# Open the file in a with-block so it is closed (and flushed) when done.
with open(fname_location, "wb") as f:
    c = csv.writer(f)
    c.writerow(headers)
    print("- Generating file %s ..." % out_fname)
    # Fetch and write one chunk at a time to keep memory use bounded.
    while sqlData:
        for row in sqlData:
            c.writerow(row)
        sqlData = rs_cursor.fetchmany(size_to_read)
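For reference, this is the kind of explicit release I'm asking about. It's only a sketch, and I'm assuming the last fetched chunk (sqlData) and the open cursor are what actually hold the memory:

import gc

# Close the cursor so the driver can free any buffered result rows.
rs_cursor.close()

# Drop the last reference to the fetched chunk, then ask the garbage
# collector to reclaim it now rather than at some later point.
del sqlData
gc.collect()

I don't know whether gc.collect() actually returns the memory to the OS or whether CPython keeps it around for reuse within the process, which is part of why I'm asking.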