I am trying to calculate the RMS value of a waveform but am running into some problems.
I take a sample every x microseconds, triggered by an interrupt. The samples are stored in an array: each time a new sample is taken, every existing value gets pushed one slot along the array and the new value goes in at the front. As I take each sample I square it and divide by 20 (the number of samples per period; the waveform is assumed to be a fixed frequency), then store it in the array. I also add it to a running sum, and once I reach 20 samples I add the newest entry and subtract the oldest one each time, like this (a fuller C sketch follows the pseudocode):
value20 = value19 //INT16 values; value1 = newest, value20 = oldest
value19 = value18
...
value1 = (sample * sample)/20
sumvalue += value1
sumvalue -= value20
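Put another way, I'm trying to keep sumvalue equal to (x1^2 + x2^2 + ... + x20^2)/20 over the last full period. Here is a fuller C-style sketch of what the interrupt handler does (names, types, and the fixed-size buffer are placeholders, not my exact code):

#include <stdint.h>

#define NUM_SAMPLES 20                 /* samples per period, fixed-frequency waveform */

static int16_t value[NUM_SAMPLES + 1]; /* value[1] = newest, value[NUM_SAMPLES] = oldest */
static int32_t sumvalue = 0;           /* running sum of the squared, pre-divided samples */

/* called from the sampling interrupt every x microseconds */
void StoreSample(int16_t sample)
{
    /* push every entry one slot towards the end; the last slot gets overwritten */
    for (int i = NUM_SAMPLES; i > 1; i--)
        value[i] = value[i - 1];

    /* square the new sample, pre-divide by the sample count, store it at the front */
    value[1] = (int16_t)(((int32_t)sample * (int32_t)sample) / NUM_SAMPLES);

    /* update the running sum: add the newest entry, subtract whatever is now in the last slot */
    sumvalue += value[1];
    sumvalue -= value[NUM_SAMPLES];
}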
I then call an RMS function which takes that sum, divides it by the last calculated RMS value (or by 1 if no RMS has been calculated yet), adds the last RMS value, and divides the whole thing by 2:
CalcRMS(sumvalue)
    INT32 tempsum
    if(RMS)
        tempsum = (sumvalue/RMS + RMS)/2
    else
        tempsum = (sumvalue + 1)/2
    RMS = tempsum
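Written out as C (again a simplified sketch with placeholder names, not my exact source), the idea is that repeated calls should settle where RMS*RMS equals sumvalue:

#include <stdint.h>

static int32_t RMS = 0; /* last calculated RMS value, 0 until the first call */

/* called once per new sumvalue */
void CalcRMS(int32_t sumvalue)
{
    int32_t tempsum;

    if (RMS)
        /* average the previous estimate with sumvalue divided by it
           (one step of the divide-and-average square root iteration) */
        tempsum = (sumvalue / RMS + RMS) / 2;
    else
        /* no previous RMS yet, so use 1 as the starting guess: (sumvalue/1 + 1)/2 */
        tempsum = (sumvalue + 1) / 2;

    RMS = tempsum;
}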
I then output RMS to the screen. The only problem is that my RMS value keeps changing, even though the waveform is constant. If I run a DC value through it my RMS stays steady, but shove in a sine wave and it goes crazy.
Hoping somebody can point me in the right direction. I don't want the answer straight up, just some nudges to get me back on track.