I have 11 years (2007 to 2017) of daily temperature files, 11 * 365 = 4015 NetCDF files in total. Each file has latitude (100,) and longitude (360,) dimensions and a temperature variable of size (360, 100). I want to find the 15-day running (moving) average at each grid point, ignoring NaN values if present. That means each mean is computed from 15 files: the means of files_list[0:15], files_list[1:16], files_list[2:17], ..., files_list[4000:] need to be found, and each of those means needs to be saved as a new NetCDF file. I have an idea of how to create a NetCDF file, but I could not work out how to compute the running (moving) average. A rough sketch of what I am trying to do is shown below.
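Conceptually, I think each window needs something like this (an untested sketch only; I am assuming numpy and netCDF4 here, that the temperature variable is literally named 'temperature', and that files is the naturally sorted list of daily files produced by the helper function further down):

import numpy as np
from netCDF4 import Dataset

window = 15
for start in range(len(files) - window + 1):            # windows 0, 1, ..., 4000
    stack = []
    for path in files[start:start + window]:
        with Dataset(path) as nc:
            data = nc.variables['temperature'][:]       # variable name is an assumption
            stack.append(np.ma.filled(data, np.nan))    # turn masked/fill values into NaN
    running_mean = np.nanmean(np.stack(stack), axis=0)  # NaN-aware mean over the 15 days
    # ... write running_mean to a new NetCDF file here ...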
Here is the function I use to read the list of daily files from a folder:
import os
import glob
import natsort

def files_list(working_dir, extension):
    '''
    input  = working directory and extension of the files (e.g. *.nc)
    output = naturally sorted list of file names in that folder
    '''
    os.chdir(working_dir)                                    # so the returned base names can be opened directly
    files = glob.glob(os.path.join(working_dir, extension))  # all files matching the extension
    files = natsort.natsorted(files)                         # natural ordering (1, 2, ..., 10 instead of 1, 10, 2)
    files_list = []                                          # empty list of file names
    for f in files:
        files_list.append(os.path.basename(f))               # keep only the file name, not the full path
    return files_list
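For example, I call it like this (the directory path is just a placeholder for my data folder):

daily_files = files_list('/path/to/daily_temperature_files', '*.nc')  # hypothetical path
print(len(daily_files))                                               # expect 4015 file names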