How to obtain/plot/analyze data

Submitted by Cathy.Smith@noaa.gov on Thu, 10/07/2010 - 14:48

 

Data Extraction

  1. NCDC NOMADS and NCMP:  reanalysis (CFSR,NARR,R1,R2); NWP (NAM, GFS, RUC); GENS ensembles, SST
  2. NOAA/ESRL Search and Plot. (R1,R2,20CR, NARR)
  3. IRI (CFSR,20CR,R1,R2)
  4. ECMWF ERA-40, ERA-Interim (Get netCDF or GRIB files from the daily or subdaily data)
  5. ERA-20C 
  6. Goddard Earth Science Model Data Information Services Center (MERRA)
  7. NCAR: highest-resolution files for all reanalyses except MERRA. GRIB parameter field extraction using cURL, with some conversion to netCDF as noted.
    1. JRA-25: Data Access > Web File Listing > Create Your Own File List (e.g. anl_p), use cURL or wget (for files).
    2. JRA-55 - same as JRA-25
    3. ERA-Interim - same as JRA-25.
    4. CFSR - same as JRA-25 (also subset with netCDF format conversion)
    5. 20CR - same as JRA-25
    6. 20CRv2c
  8. MERRA Data Subsetter  (variable list and what file to look at for a particular variable)
  9. openDAP servers: NOAA/ESRL (20CR,R1,R2,NARR), NCEP, MERRA2D, MERRA3D
  10. OpenDAP: MERRA Gridded Innovations and Observations (GIO)
  11. OpenGrADS.org: GrADS software with additional user functionality, including GUI for reanalyses including NCEP and MERRA
  12. Earth System Grid Federation (CFSR, MERRA, 20CR, JRA-25, ERA-Interim) Ana4MIPS: easy access to netCDF reanalysis data for selected variables corresponding to the CMIP5 climate model output
  13. GOAT: Geophysical Observations Analysis Tool for Matlab
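For the archives above that expose plain file lists (e.g., the NCAR "Create Your Own File List" option in item 7), the download step is easy to script with cURL, wget, or a few lines of Python. A minimal sketch; the base URL, file prefix, and YYYYMM naming pattern are hypothetical placeholders, not the actual archive layout:

```python
def build_monthly_urls(base_url, prefix, year):
    """Build one URL per month for a hypothetical prefix.YYYYMM.grb naming pattern."""
    return [f"{base_url}/{prefix}.{year}{month:02d}.grb" for month in range(1, 13)]

urls = build_monthly_urls("https://example.org/data", "anl_p", 1979)

# To actually download, e.g.:
# import urllib.request
# for url in urls:
#     urllib.request.urlretrieve(url, url.rsplit("/", 1)[-1])
```

The same pattern generalizes to daily or sub-daily file lists by changing the loop and the name template.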

Step-by-Step Guide to obtaining data files

    Step-by-step descriptions of how to get the data at the reanalysis centers are available.

Post Processing Routines and Algorithms

  1. Extrapolate MERRA pressure-level data below the surface
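The linked routine implements the MERRA-specific algorithm. Purely as an illustration of the general idea, the sketch below extrapolates temperature below the lowest valid pressure level linearly in ln(p), using the slope from the two lowest valid levels; this is a toy under stated assumptions, not the official scheme:

```python
import math

def extrapolate_below_surface(pressures, temps, p_target):
    """Linearly extrapolate temperature in ln(p) below the lowest valid level.

    pressures: levels in hPa, increasing toward the surface
    temps: temperatures (K) on those levels
    p_target: underground pressure (hPa) to extrapolate to
    """
    p1, p2 = pressures[-2], pressures[-1]   # two lowest valid levels
    t1, t2 = temps[-2], temps[-1]
    slope = (t2 - t1) / (math.log(p2) - math.log(p1))
    return t2 + slope * (math.log(p_target) - math.log(p2))

# e.g., extend 850 and 925 hPa temperatures down to 1000 hPa
t_1000 = extrapolate_below_surface([850.0, 925.0], [280.0, 285.0], 1000.0)
```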

Webtools to plot/analyze data by Function

Basic Maps

  1. IRI (CFSR,20CR,R1,R2)
  2. ESRL/PSD Search and Plot (20CR,R1,R2,NARR)
  3. MERRA: Uses Giovanni to produce maps or animations of some monthly fields. Can average over successive times.
  4. NOMADS (CFSR,NARR,R1,R2)
  5. ECMWF ERA-40, ERA-Interim (Plot maps)
  6. MERRA Atlas
  7. Web-based Reanalysis Intercomparison Tool for maps makes user-selected reanalysis fields for monthly data. It can also difference two reanalyses or selected observational datasets with user-selected climatologies.
  8. The Climate Reanalyzer makes user-selected reanalysis fields and differences for monthly data.
  9. MeteoCentre provides pre-generated synoptic maps of SLP, 1000-500 thickness, and 500 hPa height (20CR and R1).
  10. GOAT: Geophysical Observations Analysis Tool for Matlab

Other Basic Geographic Plots

  1. ESRL: Cross-sections
  2. Search and Plot
  3. MERRA: Uses Giovanni to produce Hovmollers of some monthly fields.
  4. KNMI.
  5. GOAT: Geophysical Observations Analysis Tool for Matlab

Hovmollers

  1. ESRL/PSD: Hovmollers (R1)
  2. IRI
  3. GOAT: Geophysical Observations Analysis Tool for Matlab

Advanced Plots

  1. ESRL/PSD: Can plot monthly, daily, and sub-daily cross-sections from the composite plotting pages (R1). Anomalies are available.
  2. ESRL/PSD: Hovmollers of means and anomalies of daily data (R1).
  3. GOAT: Temporal and spatial subsetting is supported via a GUI or built-in functions. Built-in calculation of anomalies and climatologies. Superimpose or show the difference between two fields. Display land cover or topography.

Composite Maps (Average different, possibly non-contiguous, dates together)

  1. ESRL/PSD: Can plot composite maps and vertical cross-sections from the composite plotting pages on monthly, daily and sub-daily time scales for R1. Anomalies are available.
  2. ESRL/PSD: Can plot composite maps from plotting pages on monthly, daily and sub-daily timescales for 20CR. Anomalies are available. For monthly data, composite maps of the 20CR ensemble spread (uncertainty) can also be plotted.
  3. GCOS/WGSP: Can plot composite maps of sea level pressure from plotting pages on monthly timescales. Anomalies are available.
  4. WRIT maps: Can plot composite maps of a reanalysis or the difference of composites from two reanalyses.
  5. GOAT: Can plot composites of non-contiguous dates. 

Correlation Maps

  1. ESRL/PSD: From monthly NCEP/NCAR R1
  2. KNMI
  3. The Climate Reanalyzer (ERA-Interim, NCEP/NCAR R1, NCEP/DOE R2, 20CR, observational datasets: PRISM precipitation, ERSSTv3b)

Timeseries Plots

  1. NOMADS
  2. TDS
  3. NOAA/ESRL PSD: Plot simple timeseries of NCEP/NCAR R1 and 20thC Reanalysis monthly variables
  4. IRI
  5. Web-based Reanalysis Intercomparison Tool for timeseries (WRIT) makes user-selected reanalysis timeseries, scatter plots, cross-correlation functions, and probability density functions for monthly data. It can also difference two reanalyses or selected observational datasets.
  6. The Climate Reanalyzer makes user-selected reanalysis time series with land/ocean masking.

Timeseries Analysis

  1. KNMI: Plot, compute annual cycle, filter and other tools available for time series analysis. Provides many climate/ocean timeseries.
  2. ESRL/PSD: Correlate and do some other simple analysis on pregenerated or user supplied monthly timeseries
  3. ESRL/PSD: Extract daily timeseries from datasets. Can supply user criteria (e.g. top 10 temperatures in January for a point). R1,20CR
  4. ESRL/PSD: Extract monthly timeseries from datasets.  R1,20CR
  5. IRI
  6. Web-based Reanalysis Intercomparison Tool for timeseries (WRIT) makes user-selected reanalysis timeseries, scatter plots, cross-correlation functions, and probability density functions for monthly data. It can also difference two reanalyses or selected observational datasets.

Google Earth

  1. NOAA/ESRL PSD: Create Google Earth plots in the browser (20CR, R1)

Miscellaneous

  1. ESRL/PSD Lead/Lag for Composites
  2. ESRL/PSD Lead/Lag for Correlations (maps)
  3. KNMI Smoothed fields, EOF, SVD and other analysis
  4. U. of Miami extreme event finder (20CR)
  5. WRIT Trajectory calculator (20CR, CFSR)

 

Applications that read/plot/analyze netCDF and/or GRIB data (non-web)

A complete list is available from Unidata

  1. NCL
  2. GrADS
  3. IDV
  4. Ferret
  5. NCO (netCDF Operators)
    Extract data from netCDF files, change attributes, combine files, and more.
  6. CDO
  7. MATLAB
  8. IDL
  9. CDAT
  10. GOAT. A MATLAB-based tool that integrates with netCDF files and OPeNDAP sources.

gilbert.p.comp…

Wed, 01/25/2012 - 10:14

Stefan,

Adding Panoply is a great idea, but Reanalyses.org is a wiki site that depends on users. You can log in and add it where you feel it is appropriate. If you have any questions, please feel free to ask or add a question to the Help section.

best wishes,
gil compo

Easwar (not verified)

Thu, 05/30/2013 - 04:08

Dear sir, I need historical/long-term wind data for a specific site in order to correlate it with actual data/nearby met-mast data. How can I get it, and where from? Kindly guide me with a procedure to download the data with an exact link. Regards, Easwar.

Dear Easwar, Happy to help, but this area is for reanalysis data. See http://reanalyses.org/ for the definition of reanalysis data. This may be what you need but your question is not clear in this respect. If you want data from a station, you should post your question in the Observations area http://reanalyses.org/observations/surface . Is your site over the ocean or over land? How close do the data need to be to your site? What is your site location? While posting at http://reanalyses.org/observations/surface, you may want to make your question a bit clearer. What do you mean by "historical/longterm" wind data? Do you want a long-term average or do you want a long time series at some temporal resolution? What is the temporal resolution you need? What is the temporal resolution you can still use (e.g., monthly averages, daily averages, once-per-day)? What is the height of the data you need? Do you want data from satellites, such as scatterometers? By providing more information in the Observations area, someone may be able to help you better. best wishes, gil compo

Luigi (not verified)

Wed, 06/12/2013 - 12:59

Dear reanalyses.org, I am trying to get daily weather data from CFSR to run an ecosystem model for a geographic area (say Italy) by using the NCDC OPeNDAP server, e.g. http://nomads.ncdc.noaa.gov/thredds/dodsC/modeldata/cmd_flxf/2000/200005/20000504/flxf01.gdas.2000050400.grb2.html, but with no luck so far. I was wondering whether there is a more direct way to get daily time series data (in ASCII) from CFSR that people use routinely. Daily time series for surface parameters such as max/min temperature, solar radiation, precipitation, relative humidity, and wind are standard for ecosystem models, as life works on a circadian rhythm on Earth. Thanks for any hint and kind regards, Luigi

samudralav59 (not verified)

Fri, 12/27/2013 - 11:40

My present work studying warming regimes and trends requires me to acquire data and analysis tools in the open domain for ozone profiling, the ozone mixing ratio, and partial pressures. I would be very grateful to be pointed to the most reliable free sources. Thanking you, samudralav59

gilbert.p.comp…

Fri, 12/27/2013 - 13:23

Dear samudralav59, Some atmospheric reanalyses, such as NCEP/NCAR http://reanalyses.org/atmosphere/overview-current-reanalyses#NCEP1, do not provide ozone. Some, such as CFSR http://reanalyses.org/atmosphere/overview-current-reanalyses#CFSR, ERA-Interim http://reanalyses.org/atmosphere/overview-current-reanalyses#ERAINT, and MERRA http://reanalyses.org/atmosphere/overview-current-reanalyses#MERRA, provide ozone on levels. Links to the data are provided in each overview. 20th Century Reanalysis (20CR) http://reanalyses.org/atmosphere/overview-current-reanalyses#TWENT provides only the total column ozone. Note that while ozone is prognostic in 20CR, that system assimilates only surface pressure. Please read the linked references to determine what each system is doing and what data are being assimilated, particularly related to ozone. Links to various tools are given on this page where you left this comment, i.e., http://reanalyses.org/atmosphere/how-obtainplotanalyze-data and also at http://reanalyses.org/atmosphere/tools . If those do not include ozone, you may want to leave a comment on each page or use the contact on the respective linked sites. For the Web-based Reanalysis Intercomparison Tool, you can leave comments at https://reanalyses.org/atmosphere/web-based-reanalysis-intercomparison-tools-writ best wishes, gil compo

Masatomo Fujiwara (not verified)

Fri, 12/27/2013 - 18:45

I think you had better look at the original satellite ozone measurements for your purpose. The Stratospheric Processes and their Role in Climate (SPARC) project has been doing ozone measurement validation and evaluation for many years. Please go to http://www.sparc-climate.org/activities/ozone-profile-ii/ and contact the activity leaders shown there, and/or check "Website for further information" at the end of the page (i.e., http://igaco-o3.fmi.fi/VDO/index.html). There are several choices for satellite ozone measurements, but the latest version of the SAGE data set may be most useful for you. For ozone in the reanalyses, I think we need validation and evaluation before using it for climate studies. The SPARC Reanalysis Intercomparison Project (S-RIP, http://s-rip.ees.hokudai.ac.jp/index.html) has this component. For your information, the following is my quick survey of the 9 reanalyses. Please confirm for yourself by checking the references.

NCEP/NCAR & NCEP/DOE (Kalnay et al., 1996; Kistler et al., 2001; Kanamitsu et al., 2002):
- Zonally averaged seasonal climatological ozone used in the radiation computation.
- (In NCEP/DOE, the latitudinal orientation was reversed north to south.)

ERA-40 (Uppala et al., 2005; Dethof and Holm, 2004):
- TOMS and SBUV ozone retrievals (not radiances) are assimilated (1978-). Ozonesondes are not assimilated.
- Ozone in the ECMWF model is described by a tracer transport equation including a parametrization of photochemical sources and sinks.
- The ozone climatology is used in the radiation calculations of the forecast model.

ERA-Interim (Dee et al., 2011; Dragani, 2011):
- TOMS, SBUV, GOME (1996-2002), MIPAS (2003-2004), SCIAMACHY (2003-), MLS (2008-), OMI (2008-) are assimilated. SAGE, HALOE, and POAM are not assimilated.
- The ozone model and radiation calculations are basically the same as in ERA-40.

JRA-25 (Onogi et al., 2007):
- Ozone observations are not assimilated directly.
- A daily ozone distribution is prepared in advance using a CTM "nudged" to the satellite total ozone measurements and provided to the forecast model (the radiative part).

JRA-55 (Ebita et al., 2011):
- Similar to JRA-25 for 1979 onward; monthly climatology before 1979.

MERRA (Rienecker et al., 2011):
- SBUV2 ozone (version 8 retrievals) is assimilated for Oct 1978-present.
- The MERRA AGCM uses the analyzed ozone generated by the DAS (cf. a climatology for aerosol).

NCEP-CFSR (Saha et al., 2010):
- SBUV profiles and total ozone retrievals are assimilated (but not bias-adjusted; should not be used for trend detection).
- Prognostic ozone with climatological production and destruction terms computed from 2-D chemistry models (for the radiation parameterization).

20CR (Compo et al., 2011):
- "A prognostic ozone scheme includes parametrizations of ozone production and destruction (Saha et al., 2010)."

Unfortunately, it is not straightforward to automate the download of ERA-Interim and ERA-40 fields. I do have automated routines for converting ERA-40 and ERA-Interim to GOAT format, but you need to download the netCDF files yourself. If you are interested in monthly means, some of these are available at the goat-geo.org site. I can add more upon request. GOAT does support automated download for NCEP I, NCEP II, 20C Reanalysis, ORAS4, TRMM, the CloudSat/CALIPSO composite, ERSST, MERRA, and others.

Camiel Severijns (not verified)

Wed, 02/19/2014 - 03:07

To force an ensemble of ocean model experiments, I would like to use (near-)surface data from individual members of the 20CR. I downloaded a file 187501_sfcanl_mem01.tar which I assume contains the data I am looking for. However, the files in this tar file are not in GRIB format. The first few bytes contain the string 'GFS SFC'. Can anyone tell me how these files are formatted?

Dear Camiel, For the GRIB-formatted data from the NERSC Science Tape Gateway at http://portal.nersc.gov/archive/home/projects/incite11/www/20C_Reanalysis/everymember_full_analysis_fields you want the surface flux GRIB files "sflx", rather than the binary "sfcanl" files, e.g., for the first member of the 0 to 3 hour forecast, 187501_sflxgrbens_fhr03_mem01.tar, and for the first member of the 3 to 6 hour forecast, 187501_sflxgrbens_fhr06_mem01.tar. These are the file types that most other groups have used. Alternatively, you may want to obtain only your variables of interest. If a variable that you need is not at http://portal.nersc.gov/archive/home/projects/incite11/www/20C_Reanalysis/everymember_grib_indi_fg_variables we can generate it if it is in the "sflx" files. Obtaining only the individual variables you need, rather than the complete "sflxgrbens" file, may save transfer time. Please reply if one or the other of these solutions does not work for your purposes. best wishes, gil compo

Dear Gil, I have found the data files and tried to convert the GRIB files to netCDF using ncl_convert2nc (version 6.1.2). This tool stops with the following warnings (there are lots more preceding these) and two fatal errors:

warning:./TMP.2m.1871.ens.grb->TMP_98_HTGL is missing ens: 54 it: 12/31/1871 (18:00) ft: 6
warning:./TMP.2m.1871.ens.grb->TMP_98_HTGL is missing ens: 55 it: 12/31/1870 (18:00) ft: 3
warning:./TMP.2m.1871.ens.grb->TMP_98_HTGL is missing ens: 55 it: 12/31/1871 (18:00) ft: 6
fatal:NclGRIB: Couldn't handle dimension information returned by grid decoding
fatal:NclGRIB: Deleting reference to parameter because of decoding error
Classic model NetCDF does not support string types, converting initial_time1 to a character array
Dimension 'ncl_strlen_0' will be added
Classic model NetCDF does not support string types, converting ensemble0_info to a character array
Dimension 'ncl_strlen_1' will be added

Do you know what might be the problem here? Thanks, Camiel

Camiel, I was able to use ncl_convert2nc 6.0.0 to convert the file sflxgrbens_fhr03_1871010100_mem01 (add .grib suffix) to netCDF. This file is contained in the tar file accessed from http://portal.nersc.gov/archive/home/projects/incite11/www/20C_Reanalysis/everymember_full_analysis_fields/1871/187101_sflxgrbens_fhr03_mem01.tar Conversely, when I tried the file TMP.2m.1871.ens.grb accessed from http://portal.nersc.gov/archive/home/projects/incite11/www/20C_Reanalysis/everymember_grib_indi_fg_variables/TMP/TMP.2m.1871.ens.grb.tar I got ncl_convert2nc error messages very similar to yours. I access the TMP.2m.1871.ens.grb in Python. I suspect that there is a bug in ncl_convert2nc for very large files. You may want to use wgrib http://www.cpc.ncep.noaa.gov/products/wesley/wgrib.html to slice up the file into smaller pieces and see if that works. Alternatively, since the sflxgrb file does work with ncl_convert2nc, perhaps using those will be better? best wishes, gil

Camiel, You may want to see if you can enable the "large file support" in ncl_convert2nc.

compo/test_ncl_convert2nc> ncl_convert2nc -h
ncl_convert2nc inputFile(s) OPTIONS
  inputFile(s)           name(s) of data file(s) [required]
                         [valid types: GRIB1 GRIB2 HDF HDF-EOS netCDF shapefile]
  [-i input_directory]   location of input file(s) [default: current directory]
  [-o output_directory]  location of output file(s) [default: current directory]
  [-e extension]         file type, defined by extension, to convert [example: grb]
  [-u time_name]         name of the NCL-named time dimension to be UNLIMITED
  [-U new_time_name]     if -u is specified: new name of UNLIMITED variable and dimension
  [-sed sed1[,...]]      GRIB files only; set single element dimensions [default: none]
                         choices are initial_time, forecast_time, level, ensemble, probability, all, none
  [-itime]               GRIB files only; set initial time as a single element dimension (same as -sed initial_time)
  [-ftime]               GRIB files only; set forecast time as a single element dimension (same as -sed forecast_time)
  [-tps]                 GRIB files only; remove suffix representing a time period (e.g. 2h) from statistically processed variables, leaving only type of processing as a suffix (e.g. _acc, _avg)
  [-v var1[,...]]        user specified subset of variables [default: all variables]
                         ncl_filedump can be used to determine desired variable names
  [-L]                   support for writing large (>2Gb) netCDF files [default: no largefile support]

Note, though, from the ncl_convert2nc help page http://www.ncl.ucar.edu/Document/Tools/ncl_convert2nc.shtml: "-L Specifies that the resultant netCDF output file may exceed 2Gb in size on platforms that have "large file support" (LFS). However, no single variable may exceed 2Gb in the current implementation." You may need to slice out individual ensemble members for ncl_convert2nc to work on the TMP.2m.1871.grb type files. I hope that this helps. best wishes, gil

Hi Gil, After extracting the T2M data for one member only, ncl_convert2nc still fails with the same error (the -L option makes no difference). CDO has no problems converting the single-member GRIB file to netCDF. The variable name is wrong, but this can be fixed. My guess now is that something is wrong with ncl_convert2nc. Regards, Camiel

Anonymous (not verified)

Fri, 02/21/2014 - 11:34

Yeah, I agree that it is better to read every single ensemble member out from the TMP.2m.1871.grb-type files and convert them to nc files. It works for me this way. But I use "cdo -f nc copy filename.grb filename.nc" to convert the grb files to nc files. Thanks.
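A hedged sketch of scripting that per-member conversion: the file-name pattern below is hypothetical, and the generated commands assume cdo is installed and on the PATH (20CRv2 has 56 ensemble members):

```python
def cdo_convert_cmd(grb_path):
    """Build the 'cdo -f nc copy in.grb out.nc' command for one member file."""
    nc_path = grb_path.rsplit(".grb", 1)[0] + ".nc"
    return ["cdo", "-f", "nc", "copy", grb_path, nc_path]

# hypothetical per-member file names for the 56-member 20CRv2 ensemble
cmds = [cdo_convert_cmd(f"TMP.2m.1871.mem{m:02d}.grb") for m in range(1, 57)]

# To run them:
# import subprocess
# for cmd in cmds:
#     subprocess.run(cmd, check=True)
```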

Camiel Severijns (not verified)

Tue, 03/04/2014 - 04:26

I think I have found a problem with the longitude coordinate of the data files of the 20CR under http://portal.nersc.gov/archive/home/projects/incite11/www/20C_Reanalysis/everymember_grib_indi_fg_variables ncl_convert2nc fails to convert these files. CDO does convert them to netCDF, but afterwards the longitude coordinate values range from about -1.8 to 0 (from west to east). The latitude coordinates are correct. The CDO operator setgrid,t62 fixes this problem.

Camiel, Does the netCDF file returned by cdo show grid_type = "gaussian" before you use setgrid,t62? When I use cdo on these files without the setgrid,t62, I see that metadata, and the longitude coordinate goes from 0 to 358.125. But we use wgrib to separate out each ensemble member as a separate GRIB file before passing to cdo. Perhaps the ensemble dimension is confusing cdo? It looks like by specifying setgrid,t62 you have found a great workaround to a cdo problem! Thanks for sharing this. best wishes, gil
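One quick sanity check after such a conversion is to compare the file's longitudes against the expected T62 Gaussian grid (192 points, 0 to 358.125 at 1.875-degree spacing). A sketch; the file-reading part is commented out and assumes the third-party netCDF4 Python package and a hypothetical file named converted.nc:

```python
def t62_longitudes():
    """Expected longitudes of the T62 Gaussian grid: 192 points, 1.875 degrees apart."""
    n = 192
    step = 360.0 / n   # = 1.875
    return [i * step for i in range(n)]

lons = t62_longitudes()

# Compare against the 'lon' variable of a converted file, e.g.:
# from netCDF4 import Dataset   # third-party package, assumed installed
# ds = Dataset("converted.nc")
# assert all(abs(a - b) < 1e-6 for a, b in zip(ds["lon"][:], lons))
```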

Hi Subramanyam, In order to help, anyone looking at this would need a link to the data you are trying to open. GrADS and OpenGrADS are similar in their capacity to open files. You may prefer to search the GrADS forum, then ask this question if it hasn't already been discussed. I did a quick Google search and found:

http://gradsusr.org/pipermail/gradsusr/2012-December/033787.html

Good Luck

Dear Subramanyam, It looks like bug reports for OpenGrADS are entered at http://sourceforge.net/p/opengrads/bugs/ I suggest submitting a detailed bug report, including the particular CMIP5 file that is causing the problem and any error messages. Additionally, try re-downloading the CMIP5 files that are causing problems, or demonstrate in your bug report that some other software opens them correctly. best wishes,

Anonymous (not verified)

Mon, 03/09/2015 - 17:17

I have been puzzled by odd-looking scales in some downloaded ERA-I netCDF files, but have since found out about scale factors and add-offsets. However, when I apply them to some datasets, e.g. temperature and mean SLP, the new "unpacked" values are quite obviously not right, while the original values were. I tried to check by downloading an equivalent GRIB file, converting it to netCDF with CDO, then examining the new output values. They were the same as the packed version. This is confusing. Heat flux values, on the other hand, seem wrong both packed and unpacked, as ocean values in the Arctic (Barents Sea) appear more negative than those over the ice.

To assist you, anyone will probably need significantly more information. I suggest you start a page at reanalyses.org under Help (visible once authenticated) and describe precisely what steps you followed and what values you are seeing. Including screenshots of the data access request and then the output of ncdump will be helpful.

I can offer some general advice, but as Gilbert Compo said, a precise answer would require more information. Firstly, beware that some software automatically unpacks netCDF data. Also note that the scale factors and add-offsets vary from variable to variable and file to file. ERA-Interim fluxes, defined to be positive downwards, are accumulated from the beginning of the forecast for +step hours, so you need to divide by the number of seconds in step to obtain units "per second".
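For anyone hitting the same issue, the netCDF packing convention is unpacked = packed * scale_factor + add_offset, applied per variable; applying it a second time to data a tool has already unpacked gives exactly the kind of wrong values described above. A minimal sketch, with hypothetical attribute values for a 2 m temperature field:

```python
def unpack(packed_values, scale_factor, add_offset):
    """Apply the netCDF packing convention: unpacked = packed * scale_factor + add_offset."""
    return [v * scale_factor + add_offset for v in packed_values]

# hypothetical packed shorts and attributes; result is approximately [273.0, 285.0, 297.0] K
temps_k = unpack([-1200, 0, 1200], scale_factor=0.01, add_offset=285.0)
```

Many readers (e.g., the netCDF4-python default) do this automatically on read, which is why a second manual application corrupts the values.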

Dear Toihir, I'm not sure what a SAGE II V7 file has to do with reanalysis, but perhaps these suggestions will help. I suggest you consult the lead author of any reference you are using for the SAGE II data, or consult with the data center from which you procured the data. From a Google search on SAGE II, I see that the SAGE II home page is http://sage.nasa.gov/SAGE2/ . Read software is provided at https://eosweb.larc.nasa.gov/project/sage2/sage2_v7_table . Both Fortran and IDL code are provided there, so some modification will be necessary for MATLAB. I suggest you consult with a local MATLAB expert about how to interpret either the Fortran or the IDL. best wishes

Sebastian Krogh (not verified)

Wed, 12/02/2015 - 11:44

Hi, I am trying to extract daily incoming shortwave from ERA-I; the variable is ssrd (surface solar radiation downwards), which I downloaded from the ECMWF website (http://apps.ecmwf.int/datasets/data/interim-full-daily/levtype=sfc/). The problem is that the daily values that I obtain from ERA-I are too low. ssrd comes in J/m2 at daily resolution (I cannot get higher temporal resolutions), and I get values up to 7 MJ/m2/day, where values around 20+ MJ/m2/day are expected for mid-summer at this location (lat 68N, lon 134W). Has anyone run into these problems with radiation (not being able to download sub-daily radiation and getting too-small values)? Any answer is appreciated. Thanks, Sebastian

These are not daily values. If you look above "Select parameter" you will see "Select step" and "Select time". Time (and date) are the start time of the forecasts (twice daily), and step is the number of hours into the forecast. ssrd is an accumulated field, from the beginning of the forecast to the particular step. Steps are 3-hourly, out to 12 hours. However, note that further steps, out to 240 hours, are available with batch access; see "Access Public Datasets" in the left-hand menu. To convert J m-2 to W m-2, simply divide by the number of seconds in step hours, i.e. step*60*60.
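The conversion described above amounts to one division; a small sketch (function and variable names are illustrative):

```python
def accumulated_to_mean_flux(accumulated_j_per_m2, step_hours):
    """Convert an accumulated field (J m-2 over step_hours) to a mean flux (W m-2)."""
    return accumulated_j_per_m2 / (step_hours * 3600.0)

# e.g., 10.8 MJ m-2 accumulated over a 12-hour step
flux = accumulated_to_mean_flux(10.8e6, 12)   # -> 250.0 W m-2
```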

I'd like to access from 20CR (ver 2c) daily rainfall for a specific lat/lon reference point. Is there a relationship between the daily pr_wtr output variable and the monthly mean prate variable? Is it valid to compare actual daily rainfall with data derived from pr_wtr?

For GrADS, this page has an example: http://www.jamstec.go.jp/frsgc/research/iprc/nona/GrADS/plot-strem-line.html You can use this plotting page http://www.esrl.noaa.gov/psd/cgi-bin/data/testdap/plot.comp.pl to plot zonal means of meridional winds (and omega) to look at the Hadley cell. NCL will also plot streamlines: http://www.ncl.ucar.edu/Applications/stream.shtml Cathy Smith
