Transitions Newsletter

Issue 2 | Summer 2013

Lead Story

THE 2013 HURRICANE WRF

Contributed by Ligia Bernardet

As the 2013 hurricane season continues in the North Atlantic and eastern North Pacific basins, a newly minted HWRF model is providing forecasts for the National Hurricane Center (NHC) on a new machine and with significant code additions. On July 24, the operational HWRF went live on the Weather and Climate Operational Supercomputing System (WCOSS). A research version for testing continues in use on the jet computers at the NOAA ESRL Global Systems Division. New, more efficient code promises to provide quicker processing, allowing timely forecasts and the opportunity to use more sophisticated physics routines.

HWRF simulated satellite image of TC Dorian

This year’s HWRF has several new features and options. Among the most significant are:

1. New data assimilation options. The HWRF can now assimilate wind information from the tail Doppler radar (TDR) aboard hurricane reconnaissance aircraft.

2. Use of a hybrid data assimilation system, which makes better use of observations to initialize the model.

3. Increased code efficiency, which allows physics packages to be called at 30-second intervals, compared with last year’s 90 seconds.

“Ambitious plans for HWRF in 2014 and beyond include new data and multiple moving nests.”

Additionally, this year’s HWRF public release, for the first time, supports idealized tropical cyclone simulations and hurricane forecasts in basins beyond the eastern North Pacific and North Atlantic.

The DTC conducts testing and evaluation of HWRF and also serves as an accessible repository for HWRF code. Software version control ensures that HWRF developers at EMC, GSD, and other operational and research institutions obtain consistent results. Particular attention has been paid to facilitating the inclusion of new research developments into the operational configuration of all HWRF components. For instance, updated model routines for the Princeton Ocean Model for Tropical Cyclones (POM-TC), developed at the University of Rhode Island, can be introduced seamlessly.

Ambitious plans for the HWRF in 2014 and beyond include code allowing multiple moving nests in the same run, additional targets for data assimilation (dropsondes, etc.), and post-landfall forecasts of ocean surge, waves, precipitation, and inundation.

See the HWRF v3.5a public release announcement in this issue.

 


Director's Corner

Bill Lapenta

Dear Colleagues,

The end-to-end modeling systems in the NOAA operational numerical guidance suite are scientifically based, and research results must, and do, cross the “Valley of Death” into operations. However, the operational and research communities need to make this journey more efficient and cost effective. That is one reason why we have testbeds like the DTC. During 20 years as a research scientist at NASA, I had the opportunity to work closely with NWS forecasters in Huntsville and offices across the Southeast. When I accepted a job at NOAA with EMC, I thought my understanding of what it takes to work in an operational environment was sound, based on these earlier experiences. It soon became apparent, however, that my perceptions about the transition of research into operations were woefully incomplete. I believe there are many ways NOAA can build a better transition process between research and operations, and I would like to share my thoughts on the upgrade process at EMC in future issues of the DTC Newsletter.

Bill Lapenta

 


Who's Who

Ligia Bernardet

When a DTC pioneers’ plaque is eventually installed, Ligia’s name will be on it. She was involved with DTC activities even before the center formally existed, assessing the feasibility of high-resolution ensemble systems with Steve Koch. At the moment she is principally working on tropical storm forecast models as lead of the DTC Hurricane Task. For insider information about the new HWRF release and its assessment, she is the first call to make.

Ligia’s interest in meteorology and numerical modeling was first developed at the University of São Paulo, Brazil, where as an undergraduate she helped provide rainfall forecasts for the local television station and the state environmental agency. As she tells it, this was certainly a trial by fire, but also an eye-opening experience with the operational side of meteorology that stuck with her, and she credits these forecasts with preventing many deaths and injuries in urban regions where landslides pose a very serious risk. After PhD studies at Colorado State University and a subsequent postdoc in Boulder, she returned to Brazil and the operational arena for a 2-year stint on a tiger team that got a numerical modeling system going in Brazil’s National Weather Service. Since her return to Boulder in 2003, she has worked at the DTC and at the Global Systems Division of ESRL on several numerical modeling, data assimilation, and forecast evaluation projects, including projects to choose the dynamical core of the Rapid Refresh model and to improve the air-sea fluxes in Hurricane WRF.

Ligia remembers Pedro Silva Dias, one of her undergraduate professors, saying that the relationship between research and operations should be a glass door, with R2O matched by an equal dose of O2R, not the ‘valley of death’ it can sometimes seem. After those early experiences with operations, and several twists and turns along research paths, she is often amazed to have come nearly full circle at the DTC.

 


Visitors

Cold Pools and the WRF

Visitors: Robert Fovell and Travis Wilson
Contributed by Travis Wilson

Robert Fovell and Travis Wilson from the University of California, Los Angeles recently completed a visitor project titled “Improvements to modeling persistent surface cold pools in WRF”, aspects of which will be part of Travis’ PhD work. Travis spent nine months working at the DTC in Boulder, much of the time with his DTC host Jamie Wolff, and Rob visited for two weeks in March and June. A principal motivation for their study was the occasionally poor prediction in numerical models (including WRF) of the formation and breakup of fog in California’s Central Valley, and the possibility that better land surface models would improve those predictions. One significant result of their study is the development of a hybrid land surface model that combines the complexity of the Noah land surface model’s soil moisture formulation with the simplicity of a thermal diffusion (slab) heat transfer model. Some of their results were presented at the recent WRF Users Workshop in Boulder and are available at http://www.mmm.ucar.edu/wrf/users/workshops/WS2013/ppts/4.4.pdf
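The thermal diffusion (slab) half of such a hybrid can be illustrated with a minimal 1-D soil heat diffusion update. This is a sketch under assumed parameter values and boundary conditions, not the actual WRF, Noah, or UCLA code:

```python
import numpy as np

def slab_diffusion_step(T, dt, dz, kappa, t_surface):
    """One explicit finite-difference step of 1-D soil heat diffusion,
    the idea behind a thermal-diffusion ("slab") land surface scheme.
    T: soil temperatures (K) on levels spaced dz (m) apart;
    kappa: thermal diffusivity (m^2/s); t_surface: imposed skin temperature.
    Illustrative only; boundary treatment and constants are assumptions."""
    T_new = T.copy()
    T_new[0] = t_surface  # top level forced by the surface temperature
    # interior levels: dT/dt = kappa * d2T/dz2 (bottom level held fixed)
    T_new[1:-1] = T[1:-1] + dt * kappa * (T[2:] - 2.0 * T[1:-1] + T[:-2]) / dz**2
    return T_new

# Warm skin over an initially uniform 285 K soil column
T = np.full(5, 285.0)
T = slab_diffusion_step(T, dt=60.0, dz=0.1, kappa=1e-6, t_surface=290.0)
# after one step only the surface level has changed; heat then diffuses downward
```

The explicit scheme is stable here because dt * kappa / dz**2 = 0.006, well under the diffusion limit of 0.5.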


 


Community Connections

Object-based Verification at WPC

Contributed by Faye Barthold, NCEP WPC

The Weather Prediction Center (WPC) at NCEP has been using MODE to supplement its traditional verification techniques since April 2010. The Method for Object-based Diagnostic Evaluation (MODE), a utility that is part of the MET verification package, has been developed with substantial support from the DTC. Both MET and MODE are systematically expanded and maintained for specific DTC tasks and an array of outside users. MODE output is available to WPC forecasters in real time through an internal website that displays graphical verification for forecasts of 24-hr precipitation valid at 1200 UTC (see the figure). Forecasters can select the forecast lead time (Day 1 – 36 hr, Day 2 – 60 hr, or Day 3 – 84 hr) and precipitation threshold (0.50 in, 1.0 in, or 2.0 in), then view the corresponding verification for WPC forecasts and nine numerical models.

“Two things that seem to resonate with our forecasters the most are the real time aspect of the website and the visual nature of the comparison.”

The graphical nature of the MODE verification allows for a quick comparison of forecasts in a way that goes beyond traditional threat scores and bias values to consider other measures of forecast quality (distance between forecast and observed objects, differences in angle of orientation and object size, etc.). The most recent update to the website attempts to better quantify these qualities by adding statistical comparisons of the interest value and the displacement distance between matched objects to complement the traditional graphical comparisons. Future plans include additional statistical information on the website, including longer term summaries (monthly, annually, etc.), and making the website available to the public.
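One such object attribute, the displacement distance between a matched pair of forecast and observed objects, can be computed from the object centroids. The sketch below is a simplified illustration of the idea, not MODE's own implementation; the function names and the 4-km grid spacing are assumptions:

```python
import numpy as np

def object_centroid(mask):
    """Centroid (row, col) of a binary precipitation-object mask."""
    rows, cols = np.nonzero(mask)
    return rows.mean(), cols.mean()

def centroid_displacement(fcst_mask, obs_mask, grid_km=4.0):
    """Distance (km) between forecast and observed object centroids,
    assuming a uniform grid spacing in km."""
    fr, fc = object_centroid(fcst_mask)
    orow, ocol = object_centroid(obs_mask)
    return grid_km * np.hypot(fr - orow, fc - ocol)

# Toy example: a forecast object displaced 3 grid points east of the observed one
obs = np.zeros((10, 10), dtype=bool)
obs[4:6, 2:4] = True
fcst = np.zeros((10, 10), dtype=bool)
fcst[4:6, 5:7] = True
print(centroid_displacement(fcst, obs))  # 12.0 km on a 4-km grid
```

MODE combines attributes like this one with orientation-angle and size differences into a fuzzy "interest" value used to match objects.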

Running MODE on a national scale at an operational center like WPC presents some unique challenges, since MODE must correctly identify precipitation objects from meteorological phenomena ranging from cool-season synoptic-scale storms to warm-season convection. Determining the ideal configuration is still a work in progress, but it is an essential piece of the puzzle for building forecaster confidence in the utility of object-based verification.

 


Did you know?


Contributed by Adam Clark

Through its Visitor Program, the DTC is currently working with Adam Clark, a scientist at the Cooperative Institute for Mesoscale Meteorological Studies (CIMMS), on a project that uses the Method for Object-based Diagnostic Evaluation – Time Domain (MODE-TD) to identify, track, and visualize simulated supercells in high-resolution models. The approach will be applied and tested during the annual NOAA Hazardous Weather Testbed Spring Forecasting Experiments.

Supercells are identified using updraft helicity (UH) extracted at 5-minute intervals from an experimental 4-km grid-spacing version of the WRF model run in real time at the National Severe Storms Laboratory (NSSL), known as the NSSL-WRF. The UH extraction is done using a newly developed technique that minimizes data volume, and supercells are defined by applying maximum intensity and longevity criteria to UH objects identified by MODE-TD. Geographic information and object attribute information are then combined into the GeoJSON file format and displayed using an experimental web interface, the NSSL Experimental Data Explorer, developed in collaboration with Chris Karstens of CIMMS/NSSL. The image above is a screenshot from the data explorer showing the path, as depicted by UH, and associated attributes of a simulated supercell over central Oklahoma on 19 May 2013.
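Combining a track's geography and attributes into GeoJSON amounts to building a Feature whose geometry is the track and whose properties carry the object attributes. The sketch below illustrates the structure; the property names, coordinates, and values are hypothetical, not those used by the NSSL Experimental Data Explorer:

```python
import json

def supercell_track_feature(track_points, object_id, max_uh):
    """Bundle a simulated supercell track and its attributes into a
    GeoJSON Feature. Property names ("object_id", "max_uh") are
    illustrative assumptions, not the actual MODE-TD output fields."""
    return {
        "type": "Feature",
        "geometry": {
            "type": "LineString",
            "coordinates": [[lon, lat] for lon, lat in track_points],
        },
        "properties": {
            "object_id": object_id,  # MODE-TD object identifier (assumed name)
            "max_uh": max_uh,        # peak updraft helicity along the track
        },
    }

# Hypothetical eastward-moving track across central Oklahoma
track = [(-97.9, 35.2), (-97.6, 35.3), (-97.3, 35.4)]
feature = supercell_track_feature(track, object_id=12, max_uh=310.0)
geojson_text = json.dumps(feature)  # ready for a web map to consume
```

A web client can then render the LineString directly and read the attributes from `properties` when the track is clicked.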