
How do I extract the data from multiple files (there are about 100 of them)?

pw_201701010000.nc 
pw_201701010100.nc 
pw_201701010200.nc 
pw_201701010300.nc 
... 
pw_201701022300.nc

Code:

from numpy import *
from netCDF4 import Dataset

fname = 'pw_201701010000'

# Read the variables from one NetCDF file
f = Dataset(fname + '.nc')
print(f.variables)
data = f.variables['pw'][0][:]   # 'pw' field at the first time index
data = data - 0.
time = f.variables['time'][:]
lat = f.variables['latitude'][:]
lon = f.variables['longitude'][:]
f.close()


# Write the values that fall inside the lat/lon box to a text file
f = open(fname + '.txt', 'w')
for i in range(len(lat)):
    for j in range(len(lon)):
        if lat[i] >= -3.0 and lat[i] <= 0 and lon[j] >= 116.0 and lon[j] <= 119.0:
            #f.write(('%f\t%f\t%f\n') % (lat[i], lon[j], data[i, j]))
            f.write(('%f\t') % (data[i, j]))
    if lat[i] >= -3.0 and lat[i] <= 0:
        f.write('\n')
f.close()
hananoorr

1 Answer

It's quite simple to do if you use glob to get all the .nc files:

import glob
from numpy import *
from netCDF4 import Dataset

for fname in glob.iglob("*.nc"):
    # Read the variables from this NetCDF file
    f = Dataset(fname)
    print(f.variables)
    data = f.variables['pw'][0][:]   # 'pw' field at the first time index
    data = data - 0.
    time = f.variables['time'][:]
    lat = f.variables['latitude'][:]
    lon = f.variables['longitude'][:]
    f.close()

    # Write the values that fall inside the lat/lon box to a matching .txt file
    with open(fname.rsplit('.', 1)[0] + '.txt', 'w') as f:
        for i in range(len(lat)):
            for j in range(len(lon)):
                if lat[i] >= -3.0 and lat[i] <= 0 and lon[j] >= 116.0 and lon[j] <= 119.0:
                    #f.write(('%f\t%f\t%f\n') % (lat[i], lon[j], data[i, j]))
                    f.write(('%f\t') % (data[i, j]))
            if lat[i] >= -3.0 and lat[i] <= 0:
                f.write('\n')
inspectorG4dget
  • +1, but I wouldn't use glob. If there are 100 *.nc files, how many other files will be there too? If the directory is big, it would be better to use a generator version of listdir() (the glob module uses os.listdir()) and then check: if os.path.isfile(fname) and fname.endswith(".nc"). Then we wouldn't need to wait for glob.glob() to finish before starting our extraction. – Dalen Jan 03 '17 at 18:13
  • @Dalen: thanks for the tip. I've updated – inspectorG4dget Jan 03 '17 at 18:36
  • A little better. But iglob() also uses os.listdir(), which can block until it gets all the files from a directory. If there are a lot of them, the delay can be pretty big. What I meant is that you should write an iglob()-like function, but use some other listdir() that doesn't return a list but a generator object instead. That way you don't wait for os.listdir() and then iterate over the returned list a second time; you check the files as you go over the directory, file by file. – Dalen Jan 04 '17 at 07:35
  • Something like this: http://stackoverflow.com/questions/31426536/how-to-copy-first-100-files-from-a-directory-of-thousands-of-files-using-python On Windows you can use wildcards with this code directly: listdir("drive:\\dirpath\\*.nc") – Dalen Jan 04 '17 at 08:27
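
As Dalen describes in the comments above, the idea is to walk the directory lazily and filter on the .nc suffix as you go, instead of waiting for a complete file list. A minimal sketch, assuming Python 3.5+ (os.scandir() yields directory entries one at a time; the helper name iter_nc_files is just for illustration):

import os
from netCDF4 import Dataset

def iter_nc_files(dirpath="."):
    # os.scandir() is a generator-style listdir(): entries are yielded
    # one by one, so processing can start before the whole directory is read
    for entry in os.scandir(dirpath):
        if entry.is_file() and entry.name.endswith(".nc"):
            yield entry.path

for fname in iter_nc_files("."):
    f = Dataset(fname)
    data = f.variables['pw'][0][:]
    lat = f.variables['latitude'][:]
    lon = f.variables['longitude'][:]
    f.close()
    # ... same extraction / writing code as in the answer above ...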