Let's say I have a time series represented in a numpy array, where every 3 seconds, I get a data point. It looks something like this (but with many more data points):
z = np.array([1, 2, 1, 2.2, 3, 4.4, 1, 1.2, 2, 3, 2.1, 1.2, 5, 0.5])
I want to find a threshold x such that, on average, a data point will surpass x every y seconds.
Maybe my question would be easier to understand this way: let's say I've gathered some data on how many ants are leaving their mound every 3 seconds. Using this data, I want to choose a threshold x so that in the future, if the number of ants leaving at one time exceeds x, my beeper will go off. Now this is the key part - I want my beeper to go off roughly every 4 seconds. I'd like to use Python to figure out what x should be for a given amount of time y, based on an array of data I've already collected.
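Here is a minimal sketch of the approach I've been considering, assuming the right exceedance fraction is simply the sampling interval divided by y (so 3/4 here), and that the past data is representative of the future - I'm not sure this quantile idea is correct:
import numpy as np

# Data collected every 3 seconds (sampling interval)
z = np.array([1, 2, 1, 2.2, 3, 4.4, 1, 1.2, 2, 3, 2.1, 1.2, 5, 0.5])

sample_interval = 3.0   # seconds between data points
y = 4.0                 # desired average time between beeps, in seconds

# Fraction of points that should exceed the threshold so that,
# on average, one exceedance occurs every y seconds
exceed_fraction = sample_interval / y          # 3/4 in this example

# Threshold below which (1 - exceed_fraction) of the historical data falls,
# i.e. roughly exceed_fraction of past points lie above it
x = np.quantile(z, 1 - exceed_fraction)
print(x)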
Is there a way to do this in Python?