I'm working on a project right now, and at some point I am dealing with a NumPy array of shape (165L, 653L, 1024L, 1L), which is around 100 MB worth of data.
For JSON compatibility reasons, I need to turn it into a regular list, so I used the standard method
array.tolist()
The problem is that this single line results in about 10 GB of RAM consumption. Something seems wrong here. Am I not supposed to use tolist() on big arrays?
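For context, here is a small sketch (with a much smaller array than mine) of why I suspect the blowup is inherent rather than a leak: tolist() turns every element into a separate Python float object, each of which carries far more overhead than the 8 bytes it occupies in the array's contiguous buffer.

```python
import sys
import numpy as np

# Illustrative small array (not the one from my project).
a = np.zeros((100, 100), dtype=np.float64)
print(a.nbytes)  # 80000: 8 bytes per float64 in one contiguous buffer

as_list = a.tolist()
# After tolist(), each element is a standalone Python float
# (~24 bytes on 64-bit CPython), plus an 8-byte pointer per list
# slot, plus per-list object overhead for the nested lists.
rough_lower_bound = (sys.getsizeof(0.0) + 8) * a.size
print(rough_lower_bound)  # already roughly 4x a.nbytes, before list overhead
```

Scaled up to my ~110 million elements, that per-element overhead alone would account for several GB, though I'm not sure it explains the full 10 GB I'm seeing.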
I have looked around the web a bit and found some suspicions of tolist() leaking memory, notably in "Apparent memory leak with numpy tolist() in long running process" and in https://mail.python.org/pipermail/matrix-sig/1998-October/002368.html, but neither seems directly related to my problem.