Transitions Newsletter

Issue 21 | Spring 2020

Lead Story

In Memoriam: Bill Lapenta

Contributed by Louisa Nance

The DTC community mourns the passing of William “Bill” Lapenta, Ph.D. Bill was the Acting Director of NOAA’s Office of Weather and Air Quality (OWAQ), the office within NOAA’s Oceanic and Atmospheric Research that supports world-class weather and air quality research. He was also the guiding force and energy behind the Earth Prediction Innovation Center (EPIC), whose goal is to launch the U.S. forward as the world leader in numerical weather prediction (NWP) through public-academic-private partnerships, and he was committed to conquering the research-to-operations divide. Bill’s connections to the DTC date back to its early days. While Director of the Environmental Modeling Center (EMC), Bill served as a DTC Management Board member, and he eventually transitioned to serving as the lead of the DTC Executive Committee when he became the NCEP Director.

Bill had already prepared his presentation on EPIC for the American Meteorological Society Annual Meeting in January in Boston; DaNa Carlis presented it on his behalf, followed by remarks from Acting NOAA Administrator Neil Jacobs. In the presentation, Bill illustrated how public awareness of modeling was raised when the European model predicted Hurricane Sandy would make a hard left turn into the northeastern U.S. He shared EPIC’s goals: to advance numerical guidance skill, reclaim and maintain international leadership in NWP, and improve the research-to-operations transition process.

Bill then outlined how EPIC would fulfill these goals: by leveraging the weather enterprise and existing resources within NOAA, enabling scientists and engineers to collaborate effectively, strengthening NOAA’s ability to undertake research projects, and creating a community global weather research modeling system.

Bill knew it was important to establish strong partnerships with academia, the private sector, and other federal agencies that share common goals and values, and that open communication would connect leadership, programs, and scientists across organizational boundaries to deliver the best forecasts possible to America. Bill’s energy and leadership in bridging organizations inspire us to carry on his EPIC vision.

Bill Lapenta



Director's Corner

A Look Ahead

Dorothy Koch
Contributed by Dorothy Koch


At the National Weather Service’s Office of Science and Technology Integration (OSTI), we’re currently planning milestones for the coming year with our partners. OSTI manages a broad portfolio of research and development efforts designed to improve operational forecast capabilities, and our partners in these efforts are in both the research and operational communities. We aim to understand forecaster priorities, and to communicate operational requirements so that community members can come up with innovative solutions. Building connections across these diverse communities has meant learning new skills. We are accustomed to rigorous testing and evaluation processes before operational transitions, but are still figuring out how to ensure that researchers working in federal laboratories, academic organizations, and industry can easily get, run, and improve our forecasting applications. We’re currently working with organizations that have experience in this area, such as the DTC and community modeling groups.


An essential element of our strategy is a shift from running a “quilt” of many separate, diverse software applications to running a single Unified Forecast System (UFS). The UFS can produce different kinds of forecast guidance (e.g., medium-range weather prediction, hurricane forecasts, subseasonal-to-seasonal predictions, severe weather, air quality) as different configurations of a single overall system, with shared infrastructure and scientific components. The unified approach reduces the amount of software that needs to be maintained and increases physical consistency across timescales. We plan to retire or “sunset” many legacy codes and absorb their functionality into the UFS. Since there are dozens of codes in the quilt, it will take some years to fully transition to the streamlined, unified system.

One of the recent milestones we are particularly excited about is the release of the UFS medium-range weather application to the research community in March 2020. This ties together some of our big goals: it is the very first public release of the UFS and the first release that has the research customer as its primary target. It is based on the Global Forecast System v15 software that was transitioned to operations in June 2019. The UFS organization is making major changes in how code is managed and released. All the code is on GitHub, so it is easy to access, and teams are ensuring that it is portable to computer platforms outside of NOAA and that users are able to get their support questions answered. Although this first release is necessarily limited in scope and options, it puts some of the key elements in place for the planned Earth Prediction Innovation Center (EPIC), a program anticipated to launch later in FY2020 from the Weather Program Office (formerly the Office of Weather and Air Quality). EPIC will be making even greater strides toward improved software usability and more effective “Research to Operations and Operations to Research,” or R2O2R.

There are a host of updates to operational codes planned for the coming year as well. We are expecting the final implementation of the WRF-ARW-based High-Resolution Rapid Refresh short-range weather application (HRRRv4) in spring 2020. A coupled space weather Whole Atmosphere Model-Ionosphere Plasmasphere Electrodynamics system (WAM-IPEv1) is planned for implementation in early 2021, as well as an update to the Global Ensemble Forecast System (GEFSv12) in fall 2020. GEFSv12 includes separate aerosol and wave components and will be a coupled UFS implementation. Those are just a few examples; about ten implementations are planned in all for FY20. We are hoping to finalize the selections and schedule in the next few weeks.

It’s a time of great change and opportunity; it will take some time to establish effective communication and infrastructure across forecaster and researcher communities, but once established, we will be in a powerful position to accelerate the improvement of the nation’s forecast systems.

Dorothy Koch, National Weather Service's Office of Science and Technology Integration (OSTI).



Who's Who

LinLin Pan

NOAA

LinLin was born into a family of farmers in Southeast China, where the weather is dominated by the Asian Monsoon. “I can’t remember how many times my parents sighed gravely in front of the ruined crops because of the wrong prediction of Mei-yu – endless rainy days during the summer. My family would starve in this case.” LinLin was determined to study weather to help farmers, and now he works to make sure severe weather events are forecast accurately.

Diligent and determined, LinLin was accepted into the Department of Geophysics at Peking University, one of the best universities in China. After a few years of teaching, lecturing, and research, LinLin came to the U.S. to expand his interest in weather forecasting and modeling. He earned his Ph.D. in Meteorology at the University of Hawaii at Manoa within five years and came to Boulder to work at NCAR for 10 years before landing at NOAA. “Now I am working at the NOAA Global Systems Laboratory (GSL) with the Common Community Physics Package (CCPP), the FV3 Stand-Alone Regional model, and physics across scales. My typical daily tasks include running the model, investigating model outputs, and providing user support for CCPP-related questions. I am currently working to use CCPP in the coupled model.” One of LinLin’s biggest challenges is helping folks run the model on platforms outside of NOAA. “It runs fine on NOAA HPC, so we need to help them debug,” says LinLin.

“If I was not a meteorologist, I would become a builder or constructor.” LinLin likes to do yard work and help around the house in his spare time. He would also like to travel to the Arctic and Antarctic to study ways to help animals survive if our country suddenly turned icy cold. His favorite quote is “the early bird catches the worm,” and he wishes he had the power to predict the future.

LinLin Pan



Bridges to Operations

MET Collaboration

Contributed by Jason Levit

The NOAA Environmental Modeling Center (EMC) and the Developmental Testbed Center (DTC) are collaborating on using the Model Evaluation Tools (MET) for verification and validation of EMC’s suite of environmental prediction models, such as the Global Forecast System (GFS), the Global Ensemble Forecast System (GEFS), and the Rapid Refresh (RAP)/High-Resolution Rapid Refresh (HRRR). Both centers are working toward an operational configuration of MET that can be implemented on NOAA’s Weather and Climate Operational Supercomputer (WCOSS) and run in real time within a 24/7/365 operational environment.

To that end, EMC and the DTC have worked with NCEP Central Operations (NCO) to install METplus 2.1 and MET 8.1 on the developmental component of WCOSS to test and optimize the software system, with the eventual goal of installing METplus 3.0 and MET 9.0 into real-time operations in calendar year 2020. Once installed, the software will enable EMC to create a suite of real-time verification systems that provide statistics on EMC model performance to both internal and external customers. The real-time verification statistics will also be used to create graphics and displays with cloud-based METviewer and METexpress user interfaces.
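One of the headline statistics in this suite is the 500-mb anomaly correlation shown in the figure below. As a rough illustration of the quantity involved (this is not MET’s implementation; the fields are synthetic and area weighting is omitted for brevity), a minimal Python sketch:

```python
import numpy as np

def anomaly_correlation(forecast, analysis, climatology):
    """Centered anomaly correlation: the correlation between forecast
    and analysis fields after removing climatology from both."""
    f_anom = forecast - climatology
    a_anom = analysis - climatology
    f_anom = f_anom - f_anom.mean()
    a_anom = a_anom - a_anom.mean()
    return (f_anom * a_anom).sum() / np.sqrt(
        (f_anom ** 2).sum() * (a_anom ** 2).sum())

# Synthetic 500-mb geopotential height fields on a 73 x 144 grid
rng = np.random.default_rng(0)
climatology = 5500.0 + rng.normal(0.0, 50.0, size=(73, 144))
analysis = climatology + rng.normal(0.0, 30.0, size=(73, 144))
forecast = analysis + rng.normal(0.0, 10.0, size=(73, 144))  # small forecast error
print(f"Anomaly correlation: {anomaly_correlation(forecast, analysis, climatology):.3f}")
```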

Image created using METplus for the GFSv15 vs GFSv16 500mb anomaly correlation comparison.



Visitors

Evaluating the Impact of Model Physics on Forecasts of Hurricane Rapid Intensification

Visitor: Jun Zhang
Contributed by Jun Zhang

Dr. Jun Zhang, a scientist from the University of Miami and a visitor to the DTC in 2018, investigated the impact of model physics on the forecast performance of hurricane models for hurricanes undergoing rapid intensification (RI). Accurate predictions of hurricane intensity could significantly reduce economic losses, especially when a hurricane makes landfall in well-developed coastal regions.

Hurricane intensity is influenced not only by environmental factors but also by internal dynamics and thermodynamics. Previous research based on statistical modeling suggested that around 35% of the skill in predicting hurricane RI in the Atlantic basin is explained by processes related to the large-scale environment. What remains challenging is realistically representing inner-core processes, especially in the physics packages of the hurricane models. As the horizontal resolution of operational hurricane models such as the Hurricane Weather Research and Forecasting (HWRF) model approaches 1.5 km, the performance of physics traditionally used in low-resolution models should be evaluated. Dr. Zhang’s research project focused on evaluating the impact of model physics in HWRF on hurricane RI prediction.

Dr. Zhang worked with DTC staff to design numerical experiments for this project. They created extensive HWRF forecasts with two different cumulus schemes. The team also decided to use some existing HWRF forecasts from the Environmental Modeling Center (EMC) for evaluating the impacts of other physics on RI prediction.

Dr. Zhang split the HWRF retrospective forecasts into four groups based on a two-by-two contingency table: captured RI (Hit), missed RI (Miss), falsely predicted RI (False Alarm), and correctly predicted no RI (Correct Negative). For each physics component, he evaluated the model’s performance for RI prediction by building a contingency table that summarizes the counts in each group and calculating the Critical Success Index (CSI). For a given component of model physics that showed substantial improvement in the RI forecast, he also conducted a detailed analysis of the TC structure to understand why the changes in model physics made the RI forecast better.
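From those counts, the Critical Success Index is simply the number of hits divided by the sum of hits, misses, and false alarms (correct negatives do not enter). A minimal Python sketch with hypothetical counts:

```python
def critical_success_index(hits, misses, false_alarms):
    """CSI (also known as the threat score): the fraction of observed
    and/or forecast RI events that were correctly predicted."""
    return hits / (hits + misses + false_alarms)

# Hypothetical contingency-table counts for one HWRF physics configuration
hits, misses, false_alarms = 18, 30, 12
print(f"CSI = {critical_success_index(hits, misses, false_alarms):.2f}")  # CSI = 0.30
```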

Horizontal view of convective burst locations during the period between 48 and 53 h of forecast time for HWRF forecasts of Hurricane Earl (2010) initialized at 1200 UTC 27 Aug 2010 with high-Km and low-Km boundary-layer physics. The red arrow indicates the shear direction. The green arrow indicates the tilt direction. Note that Km represents the vertical eddy diffusivity. RMW is the radius of maximum wind speed. This figure is taken from Zhang and Rogers (2019).


By analyzing these HWRF forecasts, Dr. Zhang found that both the cumulus and boundary-layer schemes have a substantial impact on HWRF’s RI prediction skill, while the impact of the horizontal diffusion parameterization is relatively small. His case study of Hurricane Gonzalo (2014) showed that the Grell-Freitas cumulus scheme performs better in terms of the hurricane structure forecast than the Simplified Arakawa-Schubert scheme. Another case study, of Hurricane Earl (2010), showed that the strength of vertical turbulent mixing in the boundary layer regulates the vortex- and convective-scale structures and their interaction with the environmental wind shear. This multiscale interaction process was found to be crucial for hurricane intensification, and Dr. Zhang recommends that it be considered in future physics evaluations and upgrades.

Dr. Zhang enjoyed his visit to the DTC and found the collaborations with DTC colleagues very valuable. The DTC provided a friendly environment, and its scientists, who are very knowledgeable about model development and verification, gave great support to his project. The next step of his project is to analyze idealized HWRF simulations created by the DTC in order to understand how model physics affects hurricane intensification dynamics.

Jun Zhang



Community Connections

Communication and Outreach in the Unified Forecast System

Contributed by Cecelia DeLuca

The Unified Forecast System (UFS) is a coupled, comprehensive Earth modeling and data assimilation system that will be used in NOAA operations and by the research community. There are naturally a lot of questions about UFS from potential collaborators, for example: What UFS codes are available to run? How do I find out more information about the project, and how can I get involved?

One of the roles of the UFS Communication and Outreach Working Group (C&O WG, for short) is to communicate the answers to such questions. Members include physical scientists, social scientists, NOAA public affairs representatives, managers, and software developers. The C&O WG also established a “UFS Focus Group,” a diverse collection of 50+ graduate students, scientists, field officers, and others who have volunteered their time to review and test UFS products.

The first product generated by the C&O WG was the UFS Portal, at ufscommunity.org. Launched last year following a Focus Group review, the Portal is a one-stop shop for all things UFS: news, upcoming events, highlights of current activities, documents, and plans. Looking for answers to the questions above? Information about the March 2020 release of the UFS Medium-Range Weather Application 1.0 is available on the Portal, along with a link to support forums. There is an overview of all UFS applications (e.g., short-range weather, subseasonal-to-seasonal prediction, space weather), a description of the UFS governance structure and working groups, and information about how to get involved.

The role of the C&O WG extends beyond the Portal to more general aspects of UFS communications. Making sure that UFS participants have a space where they can work together easily is critical, which is why the C&O team collaborated with the NOAA Environmental Modeling Center (EMC) and their UFS partners to set up a GitHub organization, repositories, and wikis for UFS applications. Making the code available, testing it, and sharing documentation are key aspects of community participation. The ufs-community organization on GitHub, at https://github.com/ufs-community/ufs/wiki, is how UFS releases are being distributed.

The C&O WG is also working with community members to understand and document the usability of UFS code. To do this, the C&O WG has engaged with UFS software developers to prepare and distribute “Graduate Student Tests,” or GSTs. The UFS project defines the GST as the ability of a student to easily get code, run code, change code, test code for correct operation, and compare and evaluate results. The most recent GST was released with the Medium-Range Weather Application, and it is open to all, not just graduate students! Evaluators get, build, and run an example, change a physics parameter, rerun, and visually compare results, all in less than six hours. Then they fill out a questionnaire about their experience. Results will be shared on the Portal, and the feedback will be used to improve future releases. Like other C&O WG activities, the GST encourages communication that will help advance the UFS scientifically and computationally.
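As a flavor of that final comparison step, the sketch below differences one field between a control run and a rerun with a changed physics parameter; the file names and variable name here are hypothetical, not prescribed by the GST:

```python
import matplotlib.pyplot as plt
import xarray as xr

# Hypothetical 24-h forecast output from a control build and from a
# rerun with one physics parameter changed (paths are illustrative).
control = xr.open_dataset("control_run/forecast_024h.nc")
experiment = xr.open_dataset("changed_param/forecast_024h.nc")

# Difference in 2-m temperature between the two runs
diff = experiment["t2m"] - control["t2m"]

diff.plot(cmap="RdBu_r")  # diverging colormap centered on zero
plt.title("2-m temperature: experiment minus control (24-h forecast)")
plt.savefig("gst_comparison.png", dpi=150)
```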

Interested in participating in a GST? See https://ufscommunity.org/index.html#/science/gst



Did you know?

CCPP Framework to be Used for Future Atmospheric Chemistry Modeling at NCAR

Contributed by Andrew Conley and Mary Barth


The Atmospheric Chemistry Observations and Modeling (ACOM) laboratory at NCAR has been developing the Model Independent Chemistry Module (MICM) to represent chemical transformations in any atmosphere model. To make this happen, MICM is designed to work with the Common Community Physics Package (CCPP) Framework, a software framework that defines how host models access physics parameterizations, including chemistry.

MICM gathers information on trace gases and chemical reactions from a database (coined the “Chemistry Café”). It then configures a set of reactions (a chemical mechanism), specifies how the chemical transformations will be solved as a set of ordinary differential equations, and describes how those gases are treated in the host atmosphere model; a data-flow diagram of this process is shown below. This suite of operations creates Fortran code, metadata, and namelist-type information for simulating the chemistry in the atmospheric model. The generated code is compliant with the CCPP Framework, allowing MICM to work with any atmosphere model that is also CCPP compliant.

MICM and its ability to function using the CCPP Framework have been tested with a simple chemistry scheme (five reactions) in a chemical box-model configuration. The box model successfully predicted the trace gas mixing ratios through several diurnal cycles. The MICM-generated Fortran code has also been successfully tested in the Whole Atmosphere Community Climate Model (WACCM) by “piggybacking” MICM trace gases with the WACCM full chemistry scheme (though not yet using CCPP).
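For intuition, here is a toy diurnal box model in Python built around a two-reaction NO/NO2/O3 mechanism; it is not MICM’s five-reaction scheme or its generated Fortran code, and the rate constants are only order-of-magnitude illustrations:

```python
import numpy as np
from scipy.integrate import solve_ivp

def j_no2(t_seconds):
    """Diurnally varying NO2 photolysis rate (1/s): on by day, off at night."""
    hour = (t_seconds / 3600.0) % 24.0
    daylight = max(0.0, np.sin(np.pi * (hour - 6.0) / 12.0))  # 06-18 local time
    return 8.0e-3 * daylight

def box_model(t, y):
    """Toy mechanism: NO + O3 -> NO2 + O2 consumes ozone, while daytime
    NO2 photolysis regenerates NO and (via fast O + O2) ozone."""
    o3, no, no2 = y
    k_no_o3 = 1.8e-14          # cm3 molecule-1 s-1, order of magnitude
    loss = k_no_o3 * no * o3   # NO + O3 -> NO2 + O2
    prod = j_no2(t) * no2      # NO2 + hv -> NO + O -> O3
    return [prod - loss, prod - loss, loss - prod]

# Initial number densities (molecules cm-3), illustrative values: O3, NO, NO2
y0 = [1.0e12, 2.5e10, 2.5e10]
t_span = (0.0, 3 * 24 * 3600.0)  # three diurnal cycles
sol = solve_ivp(box_model, t_span, y0, method="LSODA", max_step=600.0)
print("Final O3, NO, NO2 (molecules cm-3):", sol.y[:, -1])
```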

Unique aspects of MICM include the use of a web service for the mechanism preprocessor, which allows easy maintenance and development. The Chemistry Café database, which includes trace gas properties along with chemical reactions, provides a single source of information that ensures consistent and traceable values between models and simulations.

Model Independent Chemistry Module (MICM)

MICM will become the core of the NCAR/ACOM chemistry parameterizations implemented in the Multiscale Infrastructure for Chemistry and Aerosols (MUSICA), a global chemistry transport model with regional refinement. MUSICA is being developed by the atmospheric chemistry community, led by NCAR/ACOM scientists and software engineers.