Python 2.7, wxPython 3.0, numpy 1.9
I have been running an application that takes about 36K of RAM on a Windows machine, which is what I would expect; I keep a big array of data in RAM. However, when I ran it on a Linux machine from Python IDLE, memory usage was very high, and the processes were not properly cleared out of RAM after I closed the application. gnome-system-monitor showed ~10 identical instances of the application still open, one of which was taking ~1.5 GB of memory. Interestingly, that number kept slowly rising, by ~100 MB per hour.
Update: I observed similar behaviour on a Windows 7 machine, but with a much slower memory increase.
The program holds a 36 MB numpy array of 2-byte integers. I process the data and plot it with matplotlib's WXAgg backend embedded in wxPython:
from matplotlib.backends.backend_wxagg import FigureCanvasWxAgg as FCW
Over time the memory consumption goes up. I recalculate the x and y coordinates every time I plot, because this is real-time data collection and plotting (like an oscilloscope). I do not plot the entire buffer; I average blocks of N points to get ~360 points across the whole time range, roughly as in the sketch below.
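To make the pattern concrete, here is a stripped-down sketch of that kind of update loop. All names, sizes, and the timer interval are illustrative placeholders, not my actual code. The line artist is created once and only its data is updated on each tick, which avoids accumulating new artists inside the Axes on every redraw:

import wx
import numpy as np
from matplotlib.figure import Figure
from matplotlib.backends.backend_wxagg import FigureCanvasWxAgg as FCW

BUF_LEN = 18 * 1024 * 1024   # 36 MB worth of int16 samples (hypothetical size)
N_PLOT = 360                 # number of averaged points actually drawn

class ScopeFrame(wx.Frame):
    def __init__(self):
        wx.Frame.__init__(self, None, title="Scope")
        self.buf = np.zeros(BUF_LEN, dtype=np.int16)  # stand-in for the data buffer
        self.fig = Figure()
        self.ax = self.fig.add_subplot(111)
        self.canvas = FCW(self, -1, self.fig)
        # Create the Line2D once; later frames only replace its y-data.
        x = np.arange(N_PLOT)
        (self.line,) = self.ax.plot(x, np.zeros(N_PLOT))
        self.ax.set_xlim(0, N_PLOT - 1)
        self.ax.set_ylim(-2**15, 2**15)
        self.timer = wx.Timer(self)
        self.Bind(wx.EVT_TIMER, self.on_timer, self.timer)
        self.timer.Start(100)  # redraw every 100 ms

    def on_timer(self, event):
        # Block-average the buffer down to ~N_PLOT points instead of
        # drawing the raw samples.
        n = BUF_LEN // N_PLOT
        y = self.buf[:n * N_PLOT].reshape(N_PLOT, n).mean(axis=1)
        self.line.set_ydata(y)   # reuse the existing artist
        self.canvas.draw()

if __name__ == "__main__":
    app = wx.App(False)
    ScopeFrame().Show()
    app.MainLoop()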
Has anyone seen anything similar? Any thoughts on what might cause this?