CRTF Telecon - 11-19-2014

Created by william.chong. Updated on 07/18/2016 10:13

 

NOAA Climate Reanalysis Teleconference

 

19 November 2014, 2-3:05pm EST

Recording: https://mapp.adobeconnect.com/p16kx3ra5jj/

Rapporteur: Steve Penny, University of MD

Theme: Ocean Reanalysis

2:00-2:10 Welcome and introduction of new co-Lead: Suranjana Saha of NOAA NCEP Environmental Modeling Center

and new members:

1. Daryl Kleist (University of MD, Atmospheric Data Assimilation)



2. Steve Penny (University of MD, Ocean Data Assimilation)



3. Xu Li (EMC/NCEP, Atmospheric and Ocean Data Assimilation)



4. Arun Chawla (EMC/NCEP, Wave Data Assimilation)



5. Xingren Wu (EMC/NCEP, Sea Ice Data Assimilation)

Gil Compo, U of Colorado/CIRES & NOAA/ESRL Physical Sciences Division

2:10-2:35 JPL/ECCO ocean reanalysis activities. Dimitris Menemenlis, Jet Propulsion Laboratory, Caltech

2:35-3:00 Development of a near-surface sea temperature (NSST) analysis within the NCEP GFS/CFS systems. Xu Li, NOAA NCEP Environmental Modeling Center.

3:00-3:05 Plan for short reports on progress on selected Foci.

 

Adobe Connect:
1. Click the link below or copy and paste the link into a browser.
2. Enter as a Guest with your name. Click "Enter Room".
3. Dial-in to the teleconference line via phone.
4. To share your webcam, click the 'webcam icon' at the top of the window and "Start My Webcam".
 
To hear the audio:
Domestic - 866-710-6541
International - 203-280-9279
Participant Passcode: 5841149
 
Notes from the Teleconference:

Suru introduction:

 

The main effort now is Climate Forecast System version 3 (CFSv3) development. CFSv2 took 7 years to implement and

includes atmosphere, ocean, land, and sea ice components. CFSv2 uses an atmospheric resolution of T574 for the operational analysis, and T126 for forecasts out to 9 months.

 

For CFSv3, there is a planned increase in horizontal and vertical resolution, as well as a tentative increase in forecast time to 1 year. EMC is unifying the global systems (GFS, GEFS, CFS) into one system under NEMS. CFS will be coupled to new components, adding waves and aerosols in the next version, CFSv3.

 

The aim is a fully coupled system by the end of this development round.

 

New official task force members:

 

1. Daryl Kleist (University of Maryland, Atmospheric Data Assimilation)

10 years of experience at the EMC.

Instrumental in development of the GFS hybrid.

Currently on the UMD faculty.

Working in areas of algorithm development. 

 

2. Steve Penny (University of Maryland, Ocean Data Assimilation)

Currently on faculty at UMD.

Visiting scientist at NCEP.

Developing a hybrid ensemble/3DVar ocean data assimilation system at NCEP.

Developing a strongly coupled data assimilation system with CFSv2 for India's National Monsoon Mission.

Leading development of a nested regional/global hybrid data assimilation system for the Indian Ocean.

Developing improvements in the fundamental theory of data assimilation.

 

3. Xu Li (EMC/NCEP, Atmospheric and Ocean Data Assimilation)

Improving ocean component of NWP.

Improving SST representation inside the mixed layer.

Further details will come in today’s presentation.

 

4. Arun Chawla (EMC/NCEP, Wave Data Assimilation)

Plans to couple the Wavewatch spectral wave model with the ocean and atmosphere.

A coupled ensemble data assimilation system is planned.

Research interests include investigating Stokes drift and Langmuir mixing.

 

5. Xingren Wu (EMC/NCEP, Sea Ice Data Assimilation)

Works on sea ice for the global forecast system and CFS. Participates in the CFS reanalysis, for sea ice and other products.

 

Dimitris Menemenlis (JPL), ECCO reanalysis activities:

The presentation discussed project highlights from the 2014 ECCO meeting. ECCO stands for Estimating the Circulation and Climate of the Ocean, and was started by Carl Wunsch and collaborators at MIT in the late 1990s. A key objective of ECCO arose from the data collected during WOCE: Carl was looking for the best way to use that information. It was clear that the information was incomplete, and with the advent of TOPEX/Poseidon it became clear that models were needed to interpolate to regions and times with no observations.

Unlike operational analyses, ECCO is aimed at climate studies. ECCO pursued methodologies that preserve properties in time and avoid discontinuities, following the dynamical equations more exactly rather than treating the fit to observations as the primary goal. Generally, the ECCO effort has tried to make the models consistent with the observations rather than nudging or pushing the model toward the observations.

The ECCO effort uses the MITgcm. ECCO makes heavy use of the adjoint, and has put effort into developing automatic differentiation tools. One of these has become a commercial tool, and another is currently being developed as an open-source automatic differentiation tool: OpenAD.
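The adjoint/automatic-differentiation idea mentioned here can be illustrated with a toy reverse-mode sketch in Python. This is not ECCO or OpenAD code; the `Var` class and its methods are invented purely for illustration of how an adjoint sweep accumulates sensitivities backwards through a computation.

```python
# Toy reverse-mode automatic differentiation (the technique behind adjoint
# tools such as OpenAD). Invented for illustration; NOT production AD code.

class Var:
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents  # pairs of (parent_var, local_gradient)
        self.grad = 0.0

    def __add__(self, other):
        return Var(self.value + other.value, ((self, 1.0), (other, 1.0)))

    def __mul__(self, other):
        return Var(self.value * other.value,
                   ((self, other.value), (other, self.value)))

    def backward(self, seed=1.0):
        # Walk the computation graph backwards, accumulating gradients.
        self.grad += seed
        for parent, local_grad in self.parents:
            parent.backward(seed * local_grad)

x = Var(3.0)
y = Var(2.0)
f = x * y + x        # f = x*y + x, so df/dx = y + 1 = 3, df/dy = x = 3
f.backward()
print(x.grad, y.grad)  # 3.0 3.0
```

One forward evaluation builds the graph; one backward sweep yields the gradient with respect to every input, which is why adjoint methods scale so well for state estimation with many control variables.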

 

The ECCO team has put much effort in the past few years into moving to a new grid: the lat/lon/cap grid.

Previously, the cubed-sphere grid was used: it is nice in the Arctic, but people ultimately wanted output on a lat/lon grid for analysis and comparison. The team interpolated, but the lat/lon/cap grid is more efficient for end users.

Other highlights from the presentation (see slides for more detail):

Sea-ice modeling studies: new ECCO solutions have fully dynamic-thermodynamic sea ice with a constrained state

MIT/GFDL common code repository

Eddying global-ocean and sea-ice adjoint-method optimizations

Current cubed-sphere grid: CS510, 18 km grid spacing

Ocean ecology and biogeochemistry

Ice sheet modeling and coupling with the ocean model

Impact of tides on large-scale ocean circulation: the parameterization of tides fails at higher and higher resolutions (e.g., the Antarctic Slope Front at the tip of the Antarctic Peninsula).

Questions:

Gil:

AMOC vs. RAPID observations (slide 3). If we're going to characterize this circulation, then 'what is truth'? The differences between the RAPID estimates and ECCO are a little troubling.

Dimitris:

This is published work by Wunsch and Heimbach. The RAPID array is not exact; it is a fit to a very simple model. They made informed choices about where they put their arrays, but like all data it will have some error. The MITgcm does not represent a lot of processes, and has errors and drift. The differences are the 'science' that we are trying to understand; we are trying to understand the climate problem indicated by these differences.

Gil:

Regarding the 2004-2006 differences compared to 2008-2012: is there something wrong with the red curve, or with how the data are presented?

Dimitris:

Speculation: the Argo array was spinning up during this period, and the ECCO ocean state estimate relies very heavily on Argo. So ECCO may have missed some of the actual 'real' spikes in the earlier period. The most interesting part is to understand and explain these differences.

Yan:

AMOC time series: there is a big lull between 2009 and 2011. What happened there?

Dimitris:

I will ask Patrick.

Jim: 

Very low NAO in that winter, and the anomaly may be an echo of that signal.

Xu Li: The development of the NSST within the NCEP GFS/CFS

Extension of the SST

In the current system, SST is the only oceanic variable used in NWP.

SST is currently used as an input to the NWP system, only as a lower boundary condition; there is no feedback (yet).

The existence of the mixed layer is often used to simplify the handling of the upper ocean.

We introduce the diurnal thermocline layer warming and thermal skin layer cooling in the Near-Surface Sea Temperature (NSST) model. (slide 4)

The product is essentially a piecewise linear temperature profile

The foundation temperature is essentially the temperature at the base of the mixed layer, which varies in space and time.

The form is: foundation temperature + depth-dependent departures.

Scientific basis:

1) more realistic ocean

2) More effective use of observations: more observations, and assimilation of AVHRR in combination with in situ data, using the data assimilation capability of the GSI.

3) A more consistent initial condition, and a prognostic SST.

Not yet coupled to the ocean model. (but coupling with the Hybrid-GODAS is planned)

The NSST is a coupled analysis, whereas the GHRSST is not.

Evaluation of the SST:

(Not showing impact on weather forecasting, or satellite analysis)

Results in the presentation show:

Bias to own analysis, bias to buoy observation, RMS to own analysis, and RMS to buoy observation

Gil: This is for a particular region?

Xu: No, global.

Gil: OK, so that's why the bias is small.

Xu: The analysis is from our own system, and we compare the forecast against the analysis. The analysis is system-dependent, but the buoy data are not.

The SST is assimilated in the atmospheric system; one reason is that this facilitates moving to coupled data assimilation.

CFS:

The coupled data assimilation is ‘weakly coupled’, not ‘strongly coupled’.

The SST may not give an improvement over the persistence prediction at first, since the model bias may have a negative influence.

We would like to couple with an OGCM so the SST foundation temperature evolves in time, consistent with in situ observations.

In current coupled prediction, the top-layer mean temperature of the model is used as the SST seen by the atmosphere. We can use the NSST profile to relate the first-layer temperature to the foundation temperature, and use that to derive the interface SST.

Eventually, the hybrid ocean data assimilation will be coupled with the NSST and the atmospheric NWP in the next version, CFSv3.

Questions:

Further questions will be added as comments on the agenda page.

Gil:

I have several questions about the procedure you are using; they will be asked in the comments offline (outside the teleconference; see below).

Would like:

Short reports on selected foci from the new members.

On a subsequent agenda, volunteer to make a report related (try to link, even if loosely) to the foci.

 

gilbert.p.comp…

Wed, 11/19/2014 - 14:08

Xu, thanks again for your presentation. How are you obtaining the foundation temperature in the NSST analysis? Do you have some specified background error covariance to spread out the information in space? If necessary, you can start a new page on reanalyses.org that details the procedure and then post the link here. Best wishes, Gil

Gil,

The foundation temperature (Tf) is the analysis variable added to the GSI for NSST.
So, it is analyzed using the observations (all of which are related to Tf by an observation operator) and the background, or first guess.

This is done in the GSI together with the atmospheric analysis variables, by minimizing a single cost function.
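Schematically, the single cost function being minimized is the standard variational form, with the control vector extended to include Tf (this is textbook 3DVar notation, not taken from the GSI code):

```latex
J(\mathbf{x}) = \tfrac{1}{2}\,(\mathbf{x}-\mathbf{x}_b)^{\mathsf T}\mathbf{B}^{-1}(\mathbf{x}-\mathbf{x}_b)
              + \tfrac{1}{2}\,\bigl(\mathbf{y}-H(\mathbf{x})\bigr)^{\mathsf T}\mathbf{R}^{-1}\bigl(\mathbf{y}-H(\mathbf{x})\bigr)
```

Here x is the analysis vector (atmospheric variables plus Tf), x_b the background or first guess, B the background error covariance, y the observations (including those related to Tf through the observation operator H), and R the observation error covariance.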

Yes, a background error covariance is used for the analysis; it is the same as that used in the NCEP RTG SST analysis right now.
Obviously, it can be improved.
One good way is to take advantage of the hybrid EnKF-GSI by extending the analysis variables to include the oceanic one in the ensemble part.
At present, the Tf analysis is done only in the static analysis.

I will think about starting a new page, as you suggested.

Thanks!

Xu
