I'm trying to set up a filter in Python, and I'm stuck on something basic.
This is the function that I have to implement:
y[n] = (1 − λ) · x[n] + λ · y[n − 1]
I'm reading a .m file and applying the filter.
Here's the code:
import sys
import numpy as np
import matplotlib.pyplot as plt
y = np.loadtxt('acs712_192us.m')
size = len(y)
x = np.arange(0, size)
out = []
lamb = 0.9
for i in range(0, len(y)):
    out.append(((1 - lamb) * y[i]) + (lamb * out[i - 1]))
plt.plot(x, y)
plt.plot(x, out)
plt.show()
When I run this program, I get the following error:

File "main_LI.py", line 15, in <module>
    out.append(((1-lamb) * y[i]) + (lamb * out[i - 1]))
IndexError: list index out of range
I know that on the first iteration out is still empty, so out[i - 1] (which is out[-1] there) doesn't exist yet — maybe that's the problem?
Anyone have any tips?
Thank you guys!
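For anyone hitting the same issue: a minimal sketch of one possible fix, which seeds the output with the first input sample as the initial condition (starting from 0 would also work) so that out[i - 1] always exists inside the loop. The helper name smooth is just illustrative:

```python
def smooth(x, lamb=0.9):
    """Exponential smoothing: y[n] = (1 - lamb) * x[n] + lamb * y[n - 1]."""
    out = [x[0]]  # seed with the first sample so out[i - 1] exists from i = 1 on
    for i in range(1, len(x)):
        out.append((1 - lamb) * x[i] + lamb * out[i - 1])
    return out
```

With this shape, the original plotting code should work unchanged, since smooth returns one output value per input sample.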