I have an array of, say, 25 samples, and I want to detect whether the signal is increasing or decreasing over that 25-sample interval (the 25-sample array is my buffer, which gets a new sample every 1 ms, say).
Note that it is the general trend I am looking for, not the individual derivative (as I would obtain using finite differences or other numerical differentiation techniques).
Basically, I expect my data to be noisy, so there may still be local ups and downs even after filtering and so on. It's the general increasing or decreasing behaviour that I am looking for.
I want to use the increasing/decreasing result every ms to trigger some event, which is more of a user-interface event (blinking an LED), so some processing delay is fine as long as I can detect the general trend.
Thanks in advance!