Suppose I have multiple HDF5 files, each in a different directory corresponding to a different date.
Right now I use the following code to read each file and extract what I need.
However, the for loop that opens each file with h5py.File is quite slow.
Is there a way to read several h5 files for a single day, or for all the days, together (or at least more quickly)?
    import h5py
    from glob import glob

    for d in date_needed:
        year, month, day = d.year, d.month, d.day
        dayofyear = d.dayofyear
        # Build the expected path for this date's directory and file name
        pattern = path + '{}/{:0>3d}/data_{}{:0>2d}{:0>2d}_test.h5'.format(
            year, dayofyear, year % 100, month, day)
        files = glob(pattern)
        for fname in files:
            # Open each matching file read-only and close it when done
            with h5py.File(fname, 'r') as data:
                ...
                ...
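One idea I was considering is collecting all the file paths first and then reading them with a thread pool, roughly as in the sketch below. Here extract_useful() is just a placeholder for whatever processing I do on each file, and I am not sure whether h5py releases the GIL enough for threads to actually help, which is part of why I am asking.

    import h5py
    from glob import glob
    from concurrent.futures import ThreadPoolExecutor

    def read_one(fname):
        # Open a single HDF5 file and pull out what I need;
        # extract_useful() is a placeholder for my actual processing.
        with h5py.File(fname, 'r') as f:
            return extract_useful(f)

    # Gather every file for the dates of interest first...
    all_files = []
    for d in date_needed:
        pattern = path + '{}/{:0>3d}/data_{}{:0>2d}{:0>2d}_test.h5'.format(
            d.year, d.dayofyear, d.year % 100, d.month, d.day)
        all_files.extend(glob(pattern))

    # ...then read them concurrently instead of one by one.
    with ThreadPoolExecutor(max_workers=4) as pool:
        results = list(pool.map(read_one, all_files))

Is this a reasonable direction, or is there a better way to read many HDF5 files at once?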