Transitions Newsletter

Issue 20 | Autumn 2019

Lead Story

An Overview of the Earth Prediction Innovation Center (EPIC)

Contributed by Leah Dubots, OWAQ Pathways Intern

The Earth Prediction Innovation Center, or “EPIC,” will advance Earth system modeling skills, reclaim and maintain international leadership in Earth system prediction, and improve the transition of research to operations (R2O) and operations to research (O2R) within NOAA by working closely with partners across the weather enterprise. 

EPIC’s legislative language is included as an amendment to the Weather Research and Forecasting Innovation Act (WRFIA) of 2017 (Public Law 115-25) in the National Integrated Drought Information System Reauthorization (NIDISRA) of 2018 (Public Law 115-423). The law states that EPIC will “accelerate community-developed scientific and technological enhancements into the operational applications for numerical weather prediction (NWP).” To achieve this goal, EPIC will 

  • leverage available NOAA resources and the weather enterprise to improve NWP; 
  • enable scientists and engineers to effectively collaborate; 
  • strengthen NOAA’s ability to perform research that advances weather forecasting skills; 
  • develop a community model that is accessible to the public and computationally flexible, is located outside of secure NOAA systems, and utilizes innovative computing strategies and methods for hosting or managing all or part of the system.

EPIC builds on the Next-Generation Global Prediction System (NGGPS), which supports the design, development, and implementation of a global prediction system that will address growing service demands and increase the accuracy of weather forecasts out to 30 days. The goal of NGGPS is to expand and accelerate critical weather forecasting R2O by accelerating the development and implementation of current global weather prediction models, improving data assimilation techniques, and improving software architecture and system engineering.

A critical component of EPIC is support for a community-developed, coupled Earth modeling system known as the Unified Forecast System (UFS). EPIC will serve as the interface with the community (both internal and external to NOAA), aid in the advancement of scientific innovations to the UFS, and facilitate improvements in the R2O process by providing access to NOAA’s operational modeling code for co-development outside of the NOAA firewall. EPIC will enhance the research and development process by providing access to the UFS through a cloud-based development infrastructure, and will allow community members to conduct research and development on multiple architectures, whether cloud-based environments or traditional high-performance computing environments.


EPIC is managed in the Office of Weather and Air Quality (OWAQ) within NOAA’s Oceanic and Atmospheric Research (OAR) Line Office. An EPIC Vision Paper was released that outlines seven core investment areas: software engineering, software infrastructure, user support services, cloud-based high-performance computing, scientific innovation, management and planning, and external engagement. NOAA also signed a Memorandum of Agreement (MoA) with the National Center for Atmospheric Research (NCAR) to support infrastructure development for a UFS community model.

The EPIC Community Workshop, hosted by OWAQ on 6-8 August 2019, was attended by over 180 members of the community. The workshop provided an opportunity for members of the weather enterprise to help shape EPIC’s strategic direction by sharing ideas about potential business models, governance structures, priority areas of funding, and how to initiate EPIC. Community members recommended that EPIC be located external to NOAA and exist in a physical location, and agreed that the highest-priority funding areas are user support services, computing resources, and software engineering. They also developed EPIC mission and vision statements, shown below:

       Community-developed Mission: Advance Earth system modeling skill, reclaim and maintain international leadership in Earth system prediction and its science, and improve the transition of research into operations.

       Community-developed Vision: Create the world’s best community modeling system, of which a subset of components will create the world’s best operational forecast model.

As EPIC progresses, the program is dedicated to fostering a collaborative community environment; providing transparent and frequent program updates; and being responsive to the needs of the community. 

 

For Further Reading: 
Legislative Language
The Unified Forecast System
Next-Generation Global Prediction System (NGGPS)
Earth Prediction Innovation Center (EPIC)
* View the NOAA-NCAR MoA, EPIC Vision Paper, and EPIC Community Workshop Strategy, Summary and Recommendations PowerPoint on the EPIC Webpage. Check back frequently for program updates, additional materials, and ways to get involved.
EPIC Community Workshop Article

For Questions Please Contact:

DaNa Carlis, PhD, PMP–OWAQ Program Manager for EPIC and NGGPS, dana.carlis@noaa.gov
Krishna Kumar, PhD–OWAQ Program Coordinator for EPIC, krishna.kumar@noaa.gov
Leah Dubots–OWAQ Pathways Intern supporting EPIC, leah.dubots@noaa.gov

Earth Prediction Innovation Center (EPIC) Workshop, Boulder, Colorado, Aug 6-8, 2019

 


Director's Corner

A NOAA-NCAR Collaboration Toward Unified Modeling

Contributed by Chris Davis

What does unified modeling mean to you? Perhaps an obvious meaning is that “unified” implies one single model: one choice for dynamics, physics, data assimilation, and postprocessing. Everyone uses the same code: universities, private companies, research labs, and operational prediction centers. It is a grand vision. As a community, we are not there yet.

I argue that there is a more practical, and potentially more useful, definition of unified modeling: codes are easily shared and interoperable.

Why do I say this? At first, this approach might seem to perpetuate the lack of coordination in modeling efforts across the US, which has prevented our field from achieving its potential. But the solution to better coordination is probably not one model for everything.

Consider this question: if all researchers and agencies began with exactly the same code today, how long would it take for that code to diverge? The answer is probably a few days or less. A single code is not a realizable state without a mechanism to ensure compatibility and to effectively manage innovations. The real question is: how do we create an infrastructure where all components can be interchanged with minimal effort? A common framework, with agreed-upon best practices in software engineering, is essential to minimize the time from innovation, to testing and evaluation, and potentially to operations.

A co-developed infrastructure defines the core of the NOAA-NCAR collaboration toward unified modeling, as described in the Memorandum of Agreement (MoA) finalized in 2018. The MoA’s seven areas of infrastructure were outlined in the Spring 2019 edition of DTC Transitions. Co-development requires reconciling requirements among different agencies or institutions through careful design and the use of accepted standards such as the Earth System Modeling Framework (ESMF) and the National Unified Operational Prediction Capability (NUOPC). Co-development recognizes that multiple organizations, in this case NOAA and NCAR, have similar modeling goals, and wish to reduce duplication of effort by aligning resources.

The vision for unified modeling can be summarized by examples. A researcher at a university develops a new representation of cloud physics that is designed to work across a wide range of length scales. They do their initial development using the Weather Research and Forecasting (WRF) model but want to test this code in the Unified Forecast System (UFS). This switch becomes trivial from a software engineering perspective because the code follows compatibility standards established as part of the Common Community Physics Package. Another example is a researcher who wishes to isolate the ocean dynamics in the Modular Ocean Model (MOM6) from the UFS by prescribing the atmospheric forcing. Third, an operational researcher wants to adapt object-based evaluation methods to forecast output on seasonal time scales. These are all examples made very tractable through unified modeling, as envisioned through a compatible infrastructure. Notably, the vision includes a pathway for innovations that are initially completely separate from a particular codebase. Thus, revolution, in addition to evolution, is possible.

It is important to remember that the goal of unified modeling places some additional responsibility on developers to follow software engineering best practices and design specifications. Unification also makes no explicit reference to the science needed to make things work. With the right infrastructure, most of the time can be spent on valuable scientific analysis instead of wrangling with gnarly portability issues. The vision for unified modeling is thus not a single code, but a system that emphasizes scientific collaboration. Such collaboration will be essential to overcome the challenge of predicting our complex Earth system.

Chris Davis, NCAR

 


Who's Who

Kate Fossell

Tell us a little about yourself and your career path. I’m a Wisconsin native and grew up fascinated by severe storms. Naturally, when it came time to think about college, I looked for meteorology programs in the Midwest. I settled on Saint Louis University for my B.S. in Atmospheric Sciences and then earned my M.S. in Mathematics from the University of Wisconsin–Milwaukee (UWM). After graduation, I stayed on at UWM as a researcher while also working for a UWM-affiliated meteorological consulting company called Innovative Weather. The job required 24/7 weather forecasting shifts, and I learned a great deal about weather communication and client interactions. But I also realized that I enjoyed the research job much more. On a whim, I applied for a software engineer job at NCAR and somehow landed an interview. Not surprisingly, they wanted an actual software engineer and not a scientist pretending to be one. But another associate scientist position opened up that was a better fit, and I’ve been here eight years.

What does a typical day look like for you, and what are you currently working on? I work from my Colorado Springs home, so my days are typically spent on email, video calls, and working on projects on my computer. I look forward to my one day a week in the Boulder office. It’s usually jam-packed with meetings, but it’s great to see coworkers and have some face time. I’ve worked with the NCAR Ensemble team for six years designing and supporting real-time ensemble forecasting demonstrations as part of hazardous weather testbeds. I also do storm surge modeling to investigate the predictability of storm surge. For the DTC, I serve as the NCAR co-lead for the Unified Post Processor (UPP) package. In recent years I’ve been working on projects that use container technology to create portable numerical weather prediction systems.

What do you find most rewarding with your work? Maybe not the most rewarding, but certainly satisfying, is finding a bug in the code and fixing it. It’s fun to work on challenging projects that lead to new interests or spark someone else’s interest in a topic. I also like puzzles and the mechanics of things, so it feels good to streamline, automate, or improve a project to benefit someone else (e.g., user success stories). I love working and collaborating with extraordinarily intellectual colleagues, and knowing my work will advance science.

What did you want to be when growing up? A meteorologist. I always loved the weather. The Wizard of Oz was my favorite movie as a little kid. I wanted to be Dorothy because she got to see a tornado, so my dad called me “Dot.” 

What do you like to do in your spare time (hobbies, interests)? I had a baby last year (boy, Liam), so he keeps me plenty busy.  I enjoy taking him hiking and camping with my husband. When I can find some extra spare minutes for myself, I enjoy mountain biking, puzzles, reading, and a good cup of coffee.

Where would you like to travel, and why? Belgium - to taste the beer, eat the frites, and watch cyclocross races.

Kate Fossell

 


Visitors

Using Machine Learning to Post-Process Ensemble-based Precipitation Forecasts

Visitor: Eric Loken
Contributed by Eric Loken

Ensembles are useful forecast tools because they account for uncertainties in initial conditions, lateral boundary conditions, and/or model physics, and they provide probabilistic information to users. However, many ensembles suffer from under-dispersion, sub-optimal reliability, and systematic biases. Machine learning (ML) can help remedy these shortcomings by post-processing raw ensemble output. Conceptually, ML identifies (nonlinear and linear) patterns in historical numerical weather prediction (NWP) data during training and uses those patterns to make predictions about the future. My work seeks to answer questions related to the operational implementation of ML for post-processing ensemble-based probabilistic quantitative precipitation forecasts (PQPFs). These questions include how ML-based post-processing impacts different types of ensembles, compares to other post-processing techniques, performs at various precipitation thresholds, and functions with different amounts of training data. 

During the first part of my visit, I have used a random forest (RF) algorithm to create 24-h PQPFs from two multi-physics, multi-model ensembles: the 8-member convection-allowing High-Resolution Ensemble Forecast System Version 2 (HREFv2) and the 26-member convection-parameterizing Short-Range Ensemble Forecast System (SREF). RF-based PQPFs from each ensemble are compared against raw ensemble and spatially-smoothed PQPFs for a 496-day dataset spanning April 2017 – November 2018.
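
For readers curious about the mechanics, the sketch below illustrates the general idea of RF-based post-processing using scikit-learn. It is not the operational code used in this work; the file names, predictor arrays, and RF settings are hypothetical placeholders.

    # Illustrative sketch of RF-based PQPF post-processing (not the author's code).
    # Assumes hypothetical NumPy files in which each row holds ensemble-derived
    # predictors at one grid point and each label marks observed 1-inch exceedance.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    X_train = np.load("predictors_train.npy")    # (n_samples, n_features)
    y_train = np.load("exceed_1in_train.npy")    # (n_samples,) binary labels

    rf = RandomForestClassifier(n_estimators=200, max_depth=15,
                                min_samples_leaf=20, n_jobs=-1, random_state=0)
    rf.fit(X_train, y_train)

    # Probability of exceeding the 1-inch threshold for a new forecast case.
    X_fcst = np.load("predictors_today.npy")
    pqpf = rf.predict_proba(X_fcst)[:, 1]        # column 1 = P(exceedance)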

Preliminary results suggest that RF-based PQPFs are more accurate than the raw and spatially smoothed ensemble PQPFs (Fig. 1). An example set of forecasts for the 1-inch threshold is shown in Fig. 2. Notably, the RF PQPFs have nearly perfect reliability without sacrificing resolution, as sometimes occurs with spatial smoothing (e.g., Fig. 2b). The RF technique performs best for the SREF, presumably because it has more systematic biases than the HREFv2, and for lower precipitation thresholds, since there are more examples of observations exceeding these thresholds (i.e., the RF has more positive training examples to work with).

Figure 1: (a) Brier Skill Score (BSS) for the raw (purple), spatially smoothed (blue), and RF-based (red) ensemble PQPFs at the 1-inch threshold. (b) As in (a), but for the reliability component of the Brier Score (BS). (c) As in (a), but for the resolution component of the BS.
Figure 2: Probability of 1-inch threshold exceedance from the SREF-based (a) raw, (b) spatially smoothed, and (c) RF-based forecasts. The black contour denotes where 1-inch precipitation was observed. (d)-(f) As in (a)-(c), but for HREFv2-derived forecasts.
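
The skill measures plotted in Fig. 1 can be computed as sketched below: the Brier score, its reliability and resolution components (via the standard binned decomposition), and the Brier Skill Score relative to sample climatology. This is a generic illustration, not the specific verification code used in this study.

    # Generic sketch of the Brier score decomposition and Brier Skill Score.
    import numpy as np

    def brier_decomposition(p, o, n_bins=10):
        """p: forecast probabilities in [0, 1]; o: binary observations (0 or 1)."""
        p, o = np.asarray(p, float), np.asarray(o, float)
        n = p.size
        bs = np.mean((p - o) ** 2)                      # Brier score
        obar = o.mean()                                 # sample climatology
        edges = np.linspace(0.0, 1.0, n_bins + 1)
        bins = np.clip(np.digitize(p, edges) - 1, 0, n_bins - 1)
        rel = res = 0.0
        for k in range(n_bins):
            mask = bins == k
            nk = mask.sum()
            if nk == 0:
                continue
            rel += nk * (p[mask].mean() - o[mask].mean()) ** 2   # reliability (lower is better)
            res += nk * (o[mask].mean() - obar) ** 2             # resolution (higher is better)
        rel, res = rel / n, res / n
        unc = obar * (1.0 - obar)                       # uncertainty of the sample
        bss = 1.0 - bs / unc if unc > 0 else np.nan     # skill relative to climatology
        return bs, rel, res, bss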

 

Once an RF has undergone training, it is computationally inexpensive to run in real-time. After data preprocessing, a real-time forecast can be generated in less than a minute on a single processor. Including preprocessing, the forecast takes about 30 minutes to generate. Real-time RF PQPFs are currently being produced for the 00Z HREFv2 initialization and can be accessed at https://www.spc.noaa.gov/exper/href/ under the precipitation tab.

Future work will add temporal resolution to the ML-based forecasts and will compare the benefits of ML-based post-processing for formally-designed ensembles, whose members use the same physical parameterizations and produce equally-likely solutions (e.g., the NCAR ensemble), and informally-designed ensembles, whose members use different physical parameterizations and produce unequally-likely solutions (e.g., the Storm Scale Ensemble of Opportunity). I am grateful to the DTC Visitor Program for supporting this work. 

Eric Loken

 


Community Connections

Container & CCPP AMS Short Courses

Contributed by Kate Fossell and Grant Firl

One of the primary goals of the DTC is to provide software and infrastructure that aid in transitions between the research and operational communities.  The American Meteorological Society (AMS) provides an ideal venue for sharing these tools with the community through the short course offerings at the Annual Meeting.  The DTC is looking forward to presenting two short courses at the 100th AMS Annual Meeting in Boston, MA in January 2020.

“Integrating Numerical Weather Prediction (NWP) System Components Using Container Technology and Cloud Services”

The goal of this course is to raise awareness, through hands-on learning, about tools and facilities available to the community for testing and evaluating Numerical Weather Prediction (NWP) innovations, including the emerging set of software tools in reusable containers and cloud computing resources. Containerized software bundles the operating system, code, library dependencies, and executables needed to both build and run software packages on any computing environment.  The DTC has leveraged this technology to create a portable end-to-end system composed of various NWP components such as the Weather Research and Forecasting (WRF) model, Gridpoint Statistical Interpolation (GSI) data assimilation system, Unified Post Processor (UPP), and Model Evaluation Tools (MET). These can be executed on any platform, including in the cloud, without the typical upfront time and frustration of building the software packages from the ground up. The course will introduce the concept of containerized software, provide an overview of the NWP components available from the DTC, and offer a hands-on tutorial that will allow participants to use the containers to complete case-study examples using cloud services.  While this course may appeal to a wide-reaching audience, the information may be particularly useful to undergraduate and graduate students interested in learning more about NWP and to university faculty who may find software containers and cloud computing to be useful teaching tools to add to their course curricula. The online tutorial for the end-to-end NWP containers is publicly available at: https://dtcenter.org/community-code/numerical-weather-prediction-nwp-containers/tutorial-version-3.
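
As a rough illustration of how little setup a containerized component requires, the sketch below pulls and runs a container using the Docker SDK for Python. The image name, entry script, and paths are hypothetical placeholders; the actual DTC container names and run commands are documented in the online tutorial linked above.

    # Hypothetical sketch using the Docker SDK for Python (docker-py); the image
    # name, entry script, and paths are placeholders, not the actual DTC containers.
    import docker

    client = docker.from_env()

    # Pull a pre-built container that bundles the model code, libraries, and executables.
    client.images.pull("example/nwp-component:latest")

    # Run the container, mounting a host directory that holds the case-study input data.
    logs = client.containers.run(
        "example/nwp-component:latest",
        command="run_case.sh",                                   # placeholder entry script
        volumes={"/home/user/case_data": {"bind": "/data", "mode": "rw"}},
        remove=True,
    )
    print(logs.decode())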

“Experimentation and Development of Physical Parameterizations for Numerical Weather Prediction Using a Single-Column Model and the Common Community Physics Package (CCPP)”

This half-day course will teach participants how to develop and experiment with physics parameterizations within the CCPP framework. CCPP is the mechanism adopted by NOAA to drive atmospheric physics within Unified Forecast System (UFS) applications. NCAR also plans to use CCPP in their modeling systems (e.g., WRF, MPAS, CESM). A single-column model will be used to demonstrate how the CCPP framework works and to expose participants to physics suites available in the CCPP. The use of prepared, observationally-based cases combined with this tool’s computational simplicity will allow participants to grasp relevant concepts related to the CCPP and to conduct basic experiments. Graduate students, physics developers and researchers, as well as those with a general interest in working within NOAA frameworks could benefit from attending this course.

AMS Short Course
100th AMS Meeting

 


Did you know?

METplus

Expanding Verification Capabilities Beyond the Basics
Contributed by Tara Jensen

 

The Model Evaluation Tools (MET) were developed and released to the community by the Developmental Testbed Center more than a decade ago.  At that time, the tools included computation of traditional statistics for continuous fields (temperature, pressure, and height) and dichotomous fields (precipitation, clouds, fog, and high impact events). MET also included a spatial verification method, using objects, which complemented traditional statistics.  

Over the past five years, new flexibility and diagnostic options expanded MET capabilities.  Support for calling a Python script from within MET was added to give researchers more control over the fields supported by MET and to allow for exploration of new methods.  Additionally, a suite of Python wrappers now provides low-level automation for verification tasks. The resulting enhanced MET system is called METplus and is being adopted by DTC partners that want to tap into the power of an extensive toolset and unified verification capability.  
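
To give a flavor of the Python embedding capability, the sketch below shows the kind of script a MET tool can read in place of a gridded data file. Exact variable names, metadata keys, and grid specifications depend on the MET version, so treat this as illustrative and consult the MET User's Guide; the input file here is a hypothetical placeholder.

    # read_fcst.py -- illustrative sketch of a MET Python-embedding script.
    # MET's Python embedding generally expects the script to define a 2D array
    # named met_data and a metadata dictionary named attrs (details vary by version).
    import numpy as np

    met_data = np.load("precip_24h.npy").astype(float)   # hypothetical (ny, nx) field

    attrs = {
        "valid":     "20191015_120000",    # valid time (YYYYMMDD_HHMMSS)
        "init":      "20191014_120000",    # initialization time
        "lead":      "240000",             # forecast lead (HHMMSS)
        "accum":     "240000",             # accumulation interval
        "name":      "APCP_24",
        "long_name": "24-h accumulated precipitation",
        "level":     "Surface",
        "units":     "mm",
        "grid":      "G212",               # named grid; a full grid dictionary also works
    }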

Upcoming enhancements are driven by many projects within the DTC and through community contributions:  

  • More process-oriented diagnostics ranging from microphysical scales to sub-seasonal to seasonal scales.  
  • A renewed focus on enhancing MET-TC (Tropical Cyclone) to provide diagnostics in “cyclone-space.”  
  • Additional capabilities for evaluating high-impact weather and atmospheric composition events, using satellite data, and evaluating components of a coupled prediction system. 

Look for many of these enhancements in the METplus v3.0 release in late fall 2019.

 


Announcement

AMS 2020

100th AMS Annual Meeting in historic Boston, Massachusetts, from 12 to 16 January 2020

Join us at one of the many DTC-related presentations and posters at the 100th AMS Annual Meeting in historic Boston, Massachusetts, from 12 to 16 January 2020 in the Boston Convention and Exhibition Center.

 

_____________

Link to search 2020 AMS Annual meeting for DTC and Developmental Testbed Center: https://ams.confex.com/ams/2020Annual/meetingapp.cgi/Search/0?sort=SortableTimeString&size=50&page=1&searchterm=%22DTC%22%20%22Developmental%20Testbed%20Center%22

_____________

List of search results for abstracts that include DTC and Developmental Testbed Center. All room and event locations listed below are held in the Boston Convention and Exhibition Center.

PD1.5 Using Cloud Computing and Software Container Technology for Interactive Classroom Learning in Numerical Weather Prediction Michael J. Kavulich Jr., NCAR, Boulder, CO; and J. K. Wolff, K. Fossell, J. Halley Gotway, M. Harold, and S. Ng
Monday, January 13, 2020
08:30 AM - 08:30 AM
258C
 

1B.6 The Use of the METplus Verification and Diagnostic Capability in Forecast Evaluation across Multiple Scales and Applications Tara Jensen, NCAR, Boulder, CO; and J. Halley Gotway, G. P. McCabe Jr., J. Frimel, M. P. Row, R. G. Bullock, T. L. Fowler, D. W. Fillmore, B. Strong, M. Marquis, M. Win-Gildenmeister, J. Prestopnik, D. R. Adriaansen, and C. P. Kalb
Monday, January 13, 2020
09:45 AM - 10:00 AM
257AB

2.5 Exploring Non-Traditional Methods for Streamlining the Model Validation Process Tracy Hertneky; and T. L. Fowler and M. Harrold
Monday, January 13, 2020
11:30 AM - 11:45 AM
260

3.1 Streamlining Verification through the Enhanced Model Evaluation Tools (METplus) Tara Jensen, NCAR, Boulder, CO; and J. H. Gotway, M. P. Row, B. Strong, J. Frimel, J. J. Levit, M. Win-Gildenmeister, and M. Marquis
Monday, January 13, 2020
02:00 PM - 02:30 PM
260

The Use of METplus Verification and Diagnostic Capabilities for Evaluating Sea-Ice Predictions Lindsay R. Blank, NCAR, Boulder, CO; Developmental Testbed Center, Boulder, CO; and R. Grumbine, T. Jensen, and J. J. Levit
Monday, January 13, 2020
04:00 PM - 06:00 PM
Hall B

Adding tropical cyclone genesis verification capabilities to the Model Evaluation Tools (MET+) Daniel J. Halperin, Embry-Riddle Aeronautical Univ., Daytona Beach, FL; and K. M. Newman, J. E. Halley Gotway, and T. L. Jensen
Monday, January 13, 2020
04:00 PM - 06:00 PM
Hall B

5A.3 A Community Workflow for the Stand-Alone Regional (SAR) Configuration of the FV3 Gerard Ketefian, NOAA/ESRL/GSD and Univ. of Colorado/CIRES, Boulder, CO; and J. Beck, C. Alexander, L. Reames, G. Gayno, D. Heinzeller, L. Pan, T. Smirnova, J. Purser, D. Jovic, T. Black, J. Abeles, J. Wolff, L. Carson, J. Schramm, M. J. Kavulich Jr., J. R. Carley, and B. T. Blake
Tuesday, January 14, 2020
11:00 AM - 11:15 AM
252A

5.5 The Model Evaluation Tools (MET): Recent Additions and Enhancements John E. Halley Gotway, NCAR, Boulder, CO; and T. L. Jensen, R. G. Bullock, H. Soh, D. W. Fillmore, and J. Prestopnik
Tuesday, January 14, 2020
11:30 AM - 12:00 PM
260

3.1 Predictive Skill of African Easterly Waves in the ECMWF Subseasonal-to-Seasonal Reforecasts Weiwei Li, NCAR, Boulder, CO; Developmental Testbed Center, Boulder, CO; and Z. Wang
Tuesday, January 14, 2020
01:30 PM - 01:45 PM
212

6A.2 Combining the Common Community Physics Package with a Single Column Model to Drive NWP Physics Advancements Grant J. Firl, NCAR, and Developmental Testbed Center, Boulder, CO; and D. Heinzeller, L. Xue, and L. Bernardet
Tuesday, January 14, 2020
03:15 PM - 03:30 PM
257AB

3.4 Leveraging Cloud Computing and Software Container Technologies to Create a Portable End-to-End Numerical Weather Prediction System Kate Fossell, NCAR, Boulder, CO; and J. Wolff, J. H. Gotway, M. Harrold, and M. J. Kavulich Jr.
Tuesday, January 14, 2020
03:45 PM - 04:00 PM
155

One-Stop Shopping for Physics across Scales: From a Single-Column Model to Three-Dimensional Configurations for Weather and S2S Linlin Pan, NOAA/GSD, Univ. of Colorado/CIRES, and Developmental Testbed Center, Boulder, CO; NOAA, Boulder, CO; and L. Bernardet, D. Heinzeller, E. Kalina, G. Firl, E. Grell, K. Newman, L. Carson, and G. Grell
Tuesday, January 14, 2020
04:00 PM - 06:00 PM
Hall B

An Evaluation of Common Community Physics Package (CCPP) Physics Suites Across Scales Kathryn M. Newman, NCAR, Boulder, CO; and T. J. Hertneky, E. A. Kalina, M. Harrold, L. Pan, G. Firl, E. D. Grell, L. Carson, and M. Ek
Tuesday, January 14, 2020
04:00 PM - 06:00 PM
Hall B

Verification of the Physics Suite Testing for GFS v16 Using the Model Evaluation Tools Michelle Harrold, NCAR, Boulder, CO; and J. K. Wolff, M. Zhang, T. Hertneky, L. Bernardet, J. K. Henderson, L. R. Blank, W. Li, L. Pan, G. Firl, and T. Jensen
Tuesday, January 14, 2020
04:00 PM - 06:00 PM
Hall B

6.2 Exposing Undergraduate Students to Numerical Weather Prediction Through the Use of Software Containers and Cloud Computing Jamie K. Wolff; and S. Ng, K. R. Fossell, J. E. Halley Gotway, M. Harrold, and M. J. Kavulich Jr.
Wednesday, January 15, 2020
10:45 AM - 11:00 AM
258C

9.3 Assimilation of Dual-Pol Quality-Controlled Radial Velocity Data in the NOAA Operational Convective-Scale Forecast System
Guoqing Ge, CIRES and NOAA/ESRL/GSD, Boulder, CO; and M. Hu, S. Weygandt, and C. Alexander
Wednesday, January 15, 2020
11:15 AM - 11:30 AM
259A

9.5 The Use of the METplus Verification and Diagnostic Capability in Short-Term Forecast Evaluation Tara Jensen, NCAR, Boulder, CO; and J. Halley Gotway, C. P. Kalb, L. R. Blank, D. R. Adriaansen, and D. W. Fillmore
Wednesday, January 15, 2020
11:30 AM - 11:45 AM
252A

10B.2 A Community Effort to Unify Verification and Validation Efforts Tara Jensen, NCAR, Boulder, CO; and G. Manikin, J. A. Otkin, I. Stajner, and Z. Wang
Wednesday, January 15, 2020
01:45 PM - 02:00 PM
251

10B.3 Fostering National and International Collaboration through the Enhanced Model Evaluation Tools (METplus) Tara Jensen, NCAR, Boulder, CO; and J. Halley Gotway, M. P. Row, J. J. Levit, B. Strong, and M. Marquis
Wednesday, January 15, 2020
02:00 PM - 02:15 PM
251

11B.1 Developmental Testbed Center: Current Status and Outlook for the Future Louisa B. Nance, NCAR, Boulder, CO; and J. Beck, L. Bernardet, G. Firl, K. Fossell, M. Harrold, M. Hu, T. L. Jensen, E. Kalina, M. Marquis, K. Newman, J. K. Wolff, K. Y. Wong, and C. Zhou
Wednesday, January 15, 2020
03:00 PM - 03:15 PM
251

11B.2 The Common Community Physics Package CCPP: Unifying Physics across NOAA and NCAR Models using a Common Software Framework Dom Heinzeller, NOAA/ESRL/GSD, and Univ. of Colorado/CIRES, and Developmental Testbed Center, Boulder, CO; and G. J. Firl, L. Bernardet, L. Carson, M. Zhang, S. Goldhaber, C. Craig, D. Gill, M. Duda, and F. M. Vitt
Wednesday, January 15, 2020
03:15 PM - 03:30 PM
251

Testing the DTC’s Single Column Model for Tropical Cyclone Environment Mrinal K. Biswas, NCAR and Developmental Testbed Center, Boulder, CO; and G. Firl, M. Ek, and J. Zhang
Wednesday, January 15, 2020
04:00 PM - 06:00 PM
Hall B

The Python-based MPMC Test Suite for NOAA Operational Data Assimilation Systems (GSI/EnKF) G. Ge, CIRES and NOAA/ESRL/GSD, Boulder, CO; and M. Hu, C. Zhou, and D. Stark
Wednesday, January 15, 2020
04:00 PM - 06:00 PM
Hall B

Enhancements to Cloud Overlap Radiative Effects for Weather Forecasting and Tropical Cyclone Prediction Michael J. Iacono, AER, Lexington, MA; and J. M. Henderson, L. Bernardet, E. Kalina, M. K. Biswas, K. M. Newman, B. Liu, Z. Zhang, and Y. T. Hou
Wednesday, January 15, 2020
04:00 PM - 06:00 PM
Hall B

Track Centered Moving Grids for Tropical Cyclone Forecast Assessment in the Model Evaluation Tools (MET) Verification Package David W. Fillmore, NCAR, Boulder, CO; and T. J. Hertneky, K. M. Newman, E. A. Kalina, R. G. Bullock, M. K. Biswas, J. E. Halley Gotway, and T. L. Jensen
Wednesday, January 15, 2020
04:00 PM - 06:00 PM
Hall B

12A.3 Physics Interoperability as a Strategy for Advancing NOAA’s Unified Forecast System Physics Suites Ligia Bernardet, CIRES/Univ. of Colorado, NOAA/GSD and Developmental Testbed Center, Boulder, CO; and G. J. Firl, D. Heinzeller, L. Carson, M. Zhang, J. Schramm, and L. Nance
Thursday, January 16, 2020
11:00 AM - 11:15 AM
257AB

12A.5 Process-oriented Diagnostics to Inform the Physics Suite of Future GFS Implementations using NOAA's Unified Forecast System Weiwei Li, NCAR, Boulder, CO; and L. Bernardet, M. Zhang, L. Pan, M. Harrold, J. Wolff, J. K. Henderson, T. Hertneky, L. R. Blank, G. J. Firl, M. Ek, J. Dudhia, T. Jensen, Z. Wang, and L. Nance
Thursday, January 16, 2020
11:30 AM - 11:45 AM
257AB

13B.6 Advances Toward an Operational Convection-Allowing Ensemble Prediction System in the Unified Forecast System at NOAA J. R. Carley, NOAA, College Park, MD; and B. T. Blake, T. L. Black, E. Rogers, E. Aligo, J. Abeles, L. C. Dawson, T. Lei, Y. Lin, M. E. Pyle, P. Shafran, E. Strobach, X. Zhang, J. S. Kain, C. R. Alexander, L. J. Wicker, L. M. Harris, and J. K. Wolff
Thursday, January 16, 2020
02:45 PM - 03:00 PM
257AB

 


AGU 2019

AGU Fall Meeting 2019

Join us for the DTC presentation and posters at the AGU Fall Meeting 2019 in San Francisco, from 9 to 13 December 2019 in the Moscone Center.

A11V-2749 An Evaluation of Common Community Physics Package (CCPP) Physics Suites Across Scales
Kathryn Newman, Tracy Hertneky, Evan Anthony Kalina, Michelle Harrold, Linlin Pan, Evelyn D. Grell, Laurie Carson and Michael B Ek

Monday, 9 December 2019
08:00 - 12:20
Moscone South - Poster Hall

A13I-3018 The Use of the Enhanced Model Evaluation Tools (METplus) Verification and Diagnostic Capability in S2S
Tara L Jensen, John Halley-Gotway, Weiwei Li, Zhuo Wang, Juliana Dias and George N Kiladis

Monday, 9 December 2019
13:40 - 18:00
Moscone South - Poster Hall

A31M-2879 Evaluate the Impact of Assimilating Dual-Polarized Quality Controlled Radial Velocity on Storm Forecast with an Operational Functional Similar HRRR System
Ming Hu, Chunhua Zhou, Guoqing Ge, Ying Zhang, Michael Kavulich Jr., and Lindsay Blank

Wednesday, 11 December 2019
08:00 - 12:20
Moscone South - Poster Hall

A34E-08 Subseasonal Variability of Rossby Wave Breaking and Impacts on Tropical Cyclones during the North Atlantic Warm Season
Weiwei Li, Zhuo Wang, Gan Zhang, Melinda Peng, Stanley G Benjamin, and Ming Zhao

Wednesday, 11 December 2019
17:45 - 18:00
Moscone West - 3000, L3

 


PROUD Awards

HOWARD SOH, Software Engineer III, NSF NCAR RAL & DTC

Howard Soh is a Software Engineer III with the NSF NCAR Research Applications Laboratory who contributes to the development and support of the Model Evaluation Tools (MET) statistical component of the METplus system at the DTC.

Howard splits his time between the National Security Applications Program (NSAP) and the Joint Numerical Testbed (JNT), but his contributions to METplus far exceed expectations for that allocation. He is a fully engaged and active member of the METplus team and is a leader in several critical aspects of MET: SonarQube static code analysis, Python embedding, NetCDF APIs, point observation data ingest, and adding support for unstructured grids. Howard also monitors the METplus Discussions forum, where users submit questions for help, and provides his expertise when needed. He often answers the questions via email or Slack well before the person monitoring Discussions for that day has even read the question!

The DTC is proud to recognize Howard for consistently demonstrating excellent technical ability, initiative, communication, and leadership skills with all of his team members. He is not only a talented software engineer on the METplus team, but is also eager to lead new development tasks, takes a proactive approach in supporting customers, is friendly and approachable, and is always happy to help the team.

Thank you, Howard, for all the work you do for your DTC colleagues, partners, and customers!
