I have a class with methods to save and read CSV or .msgpack files. These files are huge (~3 GB), so reading or writing them uses a lot of memory.
I create an object of this HugeMemoryEatingClass, then delete it before creating another object of the same class to read a different file:
csvObj = HugeMemoryEatingClass("csv", fileName)
# read/save file, uses a lot of memory
del csvObj
msgObj = HugeMemoryEatingClass("msg", fileName)
msgObj.read()  # raises MemoryError
It seems msgObj is created before the memory held by csvObj has actually been freed, so when I call msgObj.read(), it raises a MemoryError.
I thought deleting the object would free up its memory.
Is there any way I can make sure the memory is freed before Python creates msgObj?
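To show what I expected, here is a minimal, runnable sketch. The HugeMemoryEatingClass below is a tiny hypothetical stand-in for my real class (the real one holds ~3 GB), and I use weakref just to observe whether the object is actually gone after del and an explicit gc.collect():

```python
import gc
import weakref

class HugeMemoryEatingClass:
    """Hypothetical stand-in for the real class; the real buffer is ~3 GB."""
    def __init__(self, kind, file_name):
        self.kind = kind
        self.file_name = file_name
        self.buffer = bytearray(10)  # pretend this is the huge payload

    def read(self):
        # Stand-in for the real read logic
        return len(self.buffer)

csvObj = HugeMemoryEatingClass("csv", "data.csv")
ref = weakref.ref(csvObj)  # lets us check whether the object survives

del csvObj    # drops the last strong reference
gc.collect()  # force a collection pass, in case reference cycles delay cleanup

print(ref() is None)  # True here: the object (and its buffer) was reclaimed

msgObj = HugeMemoryEatingClass("msg", "data.msgpack")
msgObj.read()
```

In this small sketch the object is reclaimed immediately, so I don't understand why my real program still runs out of memory between del and the next constructor call.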