I have files which contain atmospheric data at 6-hour intervals (4 files per day) in .grd format, along with the corresponding descriptor files (yyyymmddhh.ctl). I can plot the data using GrADS, but I need to convert these files into NetCDF format (.nc) so I can visualize them with Ferret. Does anyone know how to do this?
3 Answers
I would strongly suggest you use CDO to convert the files to NetCDF. For simple manipulation of .nc files, CDO or NCO is almost always the best option. In my experience, when the right operator is available, CDO is generally safer and much faster than using Python or R.
e.g.:
cdo -f nc import_binary in_grads.ctl out_ncdf.nc
You can find more on this in these topics on the CDO forum:
https://code.zmaw.de/boards/1/topics/1031
https://code.zmaw.de/boards/1/topics/213
PS: CDO hint: when chaining multiple CDO operators, use the -L option to avoid segfaults, and consider using virtual RAM space (/dev/shm on most Linux distros) for temporary files to avoid disk writes.
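As a rough sketch of what that looks like in practice (the filename follows the question's yyyymmddhh naming, the variable name t2m is just a placeholder, and pointing TMPDIR at /dev/shm assumes your CDO build uses that variable for its temporary files):
# Convert first, then chain post-processing operators in one call.
# -L serializes I/O to avoid segfaults when chaining; TMPDIR=/dev/shm keeps
# any temporary files in RAM instead of on disk.
cdo -f nc import_binary 2015091800.ctl 2015091800.nc
TMPDIR=/dev/shm cdo -L timmean -selname,t2m 2015091800.nc t2m_timmean.nc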

Thank you so much for your concern. It helped me a lot. You saved the day...:) – Deep S Banerjee Sep 18 '15 at 06:46
You definitely should use the Climate Data Operators (CDO) to do something like:
cdo -f nc import_binary in.ctl out.nc
If you have a lot of files to process you might want to write a script to process them.
For example, I had a tar file with a bunch of GrADS files from the COAMPS model that, when unpacked, resulted in 121 pairs of .ctl and .dat files, with names like:
COTC.18L.2012102512.000.ctl
COTC.18L.2012102512.000.dat
COTC.18L.2012102512.001.ctl
COTC.18L.2012102512.001.dat
COTC.18L.2012102512.002.ctl
COTC.18L.2012102512.002.dat
...
so I wrote a small bash script:
#!/bin/bash
# Convert every GrADS .ctl/.dat pair in the current directory to NetCDF
for file in *.ctl
do
    fname=${file%.ctl}    # strip the .ctl extension
    cdo -f nc import_binary "${fname}.ctl" "${fname}.nc"
    echo "${fname}.nc"
done
to convert them all to NetCDF.
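Once the loop finishes, you can sanity-check one of the new files by dumping its header (assuming the ncdump utility from the netCDF tools is installed):
ncdump -h COTC.18L.2012102512.000.nc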
Note #1: CDO can be tricky to build, but you can install it with Conda if you are on Linux or Mac.
Install Miniconda (free) if you don't already have Conda. Here's how:
Step 1. Make sure that you have a ~/.condarc and that it looks like this:
$ more ~/.condarc
channels:
- conda-forge
- defaults
Step 2. Create a custom CDO environment to run CDO:
$ conda create --yes -n CDO python=3.6 cdo
$ source activate CDO
Step 3. Run your cdo commands!
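For example, to confirm the environment's CDO is the one on your PATH and then run the conversion inside it (the .ctl name here is just a placeholder matching the question's naming):
$ cdo --version
$ cdo -f nc import_binary 2015091800.ctl 2015091800.nc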
Note #2: I also created an NcML file to virtually aggregate these data on my THREDDS Data Server. That file looked like this:
<netcdf xmlns="http://www.unidata.ucar.edu/namespaces/netcdf/ncml-2.2">
  <aggregation dimName="time" type="joinExisting">
    <scan location="." regExp=".*COTC\.18L\.[0-9]{10}\.[0-9]{3}\.nc$"/>
  </aggregation>
</netcdf>
See https://gis.stackexchange.com/questions/70919/setting-up-thredds-catalogs-for-ocean-model-data for more information on setting up the THREDDS Data Server to handle this.

Thanks for a very useful answer. When I typed this command to convert the binary file, an error occurred: unknown keyword (pdef) in the description file. How can I solve it? – Li Ziming Feb 07 '18 at 00:28
You can write out NetCDF data with GrADS using the sdfwrite command. Documentation is at http://iges.org/grads/gadoc/gradcomdsdfwrite.html
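As a minimal, untested sketch of how that could be scripted in batch mode (the descriptor name and the variable name t are placeholders for your own data):
# Write a small GrADS script that defines a variable over all times
# and dumps it to NetCDF with sdfwrite, then run it non-interactively.
cat > to_nc.gs <<'EOF'
'open 2015091800.ctl'
'set t 1 last'
'define tmp = t'
'set sdfwrite 2015091800.nc'
'sdfwrite tmp'
'quit'
EOF
grads -blc "run to_nc.gs"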