Using Python, assume I'm iterating over a known number of items `I`, and that I can time how long it takes to process each one (`t`), keeping a running total of time spent processing (`T`) and a count of the items processed so far (`c`). I'm currently calculating the average on the fly as `A = T / c`, but this can be skewed by, say, a single item taking an extraordinarily long time to process (a few seconds compared to a few milliseconds).
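For reference, a minimal sketch of what I have now (`get_items` and `process` are placeholders for my actual work):

```python
import time

items = get_items()  # placeholder: the I items to work through
T = 0.0              # running total of processing time
c = 0                # number of items processed so far

for item in items:
    start = time.perf_counter()
    process(item)    # placeholder: per-item work
    t = time.perf_counter() - start  # time taken by this item

    T += t
    c += 1
    A = T / c        # running average, easily skewed by one slow item
    print(f"average so far: {A:.4f}s over {c} items")
```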
I would like to show a running standard deviation alongside the average. How can I do this without keeping a record of each `t`?