
I have code that reads in a sensor data file and outputs a series of spectrograms augmented with added noise. The problem I'm running into is that memory usage increases on every iteration of the spectrogram generation loop, until eventually I run out of memory.

In the picture below you can clearly see the memory usage per iteration and the small residual build-up that remains after each one. The first panel shows the first 6 or 7 iterations, the second panel is around the 60th iteration, and the third panel around the 160th.

[screenshot: memory profiler traces at roughly iterations 1-7, 60, and 160]

The code:

import numpy as np
import matplotlib.pyplot as plt
import os

x = []
y = []
z = []

# Read in the data, skipping the two header lines
with open("datafile_1685530800.txt", "r") as f:
    for idx, line in enumerate(f):
        if idx > 1:
            datum = [float(v) for v in line.split(",")]
            x.append(datum[2])
            y.append(datum[3])
            z.append(datum[4])
           
x = np.array(x)
y = np.array(y)
z = np.array(z)

x_std = np.std(x)
x_len = len(x)
y_std = np.std(y)
y_len = len(y)
z_std = np.std(z)
z_len = len(z)


# Add random noise drawn from a Gaussian centred at 0 with the std of each
# data series, then generate a spectrogram for each axis
for idx in range(270):
    
    print(idx)
    
    x_r = list(x + np.random.normal(0, x_std, x_len))
    y_r = list(y + np.random.normal(0, y_std, y_len))
    z_r = list(z + np.random.normal(0, z_std, z_len))
    
    # For x axis
    os.chdir(r'X/x_norm')
    fig = plt.figure()
    ax  = plt.subplot(111)
    _, _, _, im = ax.specgram(x_r)
    ax.axis('off')
    fig.tight_layout()
    fig_name = "x_" + str(idx) + ".png"
    fig.savefig(fig_name, bbox_inches = 'tight', pad_inches = 0)
    plt.close()

    # For y axis
    os.chdir(r'../../Y/y_norm')
    fig = plt.figure()
    ax  = plt.subplot(111)
    _, _, _, im = ax.specgram(y_r)
    ax.axis('off')
    fig.tight_layout()
    fig_name = "y_" + str(idx) + ".png"
    fig.savefig(fig_name, bbox_inches = 'tight', pad_inches = 0)
    plt.close()
    
    # For z axis
    os.chdir(r'../../Z/z_norm')
    fig = plt.figure()
    ax  = plt.subplot(111)
    _, _, _, im = ax.specgram(z_r)
    ax.axis('off')
    fig.tight_layout()
    fig_name = "z_" + str(idx) + ".png"
    fig.savefig(fig_name, bbox_inches = 'tight', pad_inches = 0)
    plt.close()
    
    x_r = []
    y_r = []
    z_r = []
    
    os.chdir(r'../..')

I have reset the temporary lists at the end of the spectrogram generation loop, which has had an effect, but not enough of one to stop the memory-usage creep. Could anyone explain what is happening here and how to mitigate it?
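One pattern that usually avoids this kind of build-up is to create the figure once, outside the loop, and clear the axes between iterations instead of allocating a new figure every time. A minimal sketch for one axis, with a random dummy signal standing in for the real sensor data:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")                   # file-only backend; no GUI holding references
import matplotlib.pyplot as plt

# Hypothetical stand-in for one sensor series; the real data comes from the file
x = np.random.randn(4096)
x_std = np.std(x)

fig, ax = plt.subplots()                # create the figure ONCE, outside the loop
for idx in range(3):                    # 270 in the real script
    x_r = x + np.random.normal(0, x_std, len(x))
    ax.specgram(x_r)
    ax.axis("off")
    fig.savefig(f"x_{idx}.png", bbox_inches="tight", pad_inches=0)
    ax.clear()                          # wipe the axes for the next iteration
                                        # instead of allocating a fresh figure
plt.close(fig)                          # release the figure explicitly when done
```

Passing the figure object to `plt.close(fig)` (rather than calling `plt.close()` with no argument) also makes it unambiguous which figure pyplot should drop from its internal registry.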

DrBwts
    I would guess that the pyplot figure is not being properly closed; does https://stackoverflow.com/a/33343289/21165705 help? – lotus Aug 31 '23 at 11:57
  • @lotus thanks, it turns out that if I use `plt.clf()` followed by `plt.close()` then `matplotlib`'s API collects the garbage properly, but not if I use just one of them. – DrBwts Aug 31 '23 at 12:33

0 Answers