I am reading a chunk of data from a PyTables (version 3.1.1) Table in a big HDF5 file using the read_where method. The resulting numpy array is about 420 MB, yet the memory consumption of my Python process grows by roughly 1.6 GB during the read_where call, and the memory is not released after the call finishes. Even deleting the array, closing the file, and deleting the HDF5 file handle does not free the memory. A simplified sketch of what I am doing is shown below.
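The file name, node path, and query condition here are placeholders for my actual data; the structure of the code is otherwise what I run:

```python
import gc
import tables

h5file = tables.open_file("big_file.h5", mode="r")
table = h5file.get_node("/data/my_table")  # a tables.Table instance

# read_where returns a ~420 MB structured numpy array,
# but the process grows by ~1.6 GB while it runs.
arr = table.read_where("(timestamp > 1000) & (timestamp < 2000)")

# None of the following gives the memory back to the OS:
del arr
h5file.close()
del h5file
gc.collect()
```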
How can I free this memory again?