I have time-series data: 7200 rows (data points) and two columns (one time, one frequency). I want to calculate the average of the frequencies for every 120 rows. I wrote the following code, but I don't want to repeat it 60 times. Is there a way to do it automatically, or at least more concisely?
Thank you for your help
window1 = df1[['p1', 'p2']].iloc[0:120].mean(axis=0)    # rows 0-119; the slice end is exclusive
window2 = df1[['p1', 'p2']].iloc[120:240].mean(axis=0)  # rows 120-239
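One common way to avoid the repetition is to group consecutive blocks of 120 rows with `groupby` keyed on integer division of the row position. This is a sketch assuming a DataFrame shaped like the one described above (the `p1`/`p2` column names are taken from the snippet; the sample data here is made up for illustration):

```python
import numpy as np
import pandas as pd

# Hypothetical stand-in for df1: 7200 rows, two numeric columns
df1 = pd.DataFrame({
    'p1': np.arange(7200, dtype=float),
    'p2': np.arange(7200, dtype=float) * 2.0,
})

# Integer-divide each row's position by 120 so rows 0-119 get key 0,
# rows 120-239 get key 1, and so on; then average within each block
window_means = df1[['p1', 'p2']].groupby(np.arange(len(df1)) // 120).mean()

print(window_means.shape)  # (60, 2): one mean row per 120-row window
```

Row `k` of `window_means` then corresponds to your `window{k+1}` variable. If the time column is a proper `DatetimeIndex`, `df1.resample(...)` on a time frequency would be the more idiomatic alternative.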