
I am using SLURM to run a deep learning framework, and I am trying to integrate different data containers into it (hdf5, bcolz ctable, and zarr).

When I run the framework with a ctable as the data structure, I get the error:

"slurmstepd: error: Exceeded step memory limit at some point"

I think the problem is that, when reading from a ctable, there is no ctable.close() method that would release its memory afterwards.
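
For comparison, this is what I mean (the paths and the dataset name are placeholders; with h5py I can close the file explicitly, but as far as I can see there is no equivalent on a ctable):

    import bcolz
    import h5py

    # With hdf5 I can release the handle (and its caches) explicitly:
    f = h5py.File('data.h5', 'r')      # placeholder path
    batch = f['features'][0:1024]      # hypothetical dataset name
    f.close()                          # memory is given back here

    # With bcolz there is, as far as I can see, no equivalent:
    ct = bcolz.open(rootdir='data.bcolz', mode='r')  # placeholder path
    batch = ct[0:1024]
    # ct.close()  # AttributeError: 'ctable' object has no attribute 'close'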

Does anyone have an idea how to free the memory in this case?

I tried self.free_cachemem(), but it doesn't help.
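
Concretely, the attempt looks roughly like this (self.free_cachemem() is a method of my own wrapper; I am assuming here that it forwards to bcolz's per-column carray.free_cachemem(), which only drops the internal caches):

    import gc

    import bcolz

    ct = bcolz.open(rootdir='data.bcolz', mode='r')  # placeholder path

    # ... the training loop reads slices of ct here ...

    # Drop bcolz's internal (decompression) caches for every column;
    # the compressed column data itself seems to stay resident.
    for name in ct.names:
        ct[name].free_cachemem()

    gc.collect()  # the step still exceeds the SLURM memory limit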

Thank you.
