Transitions Newsletter

Bridges to Operations: Technical Aspects of Generalizing the UFS to use Multiple Dynamical Cores

Winter 2024

Following the scientific and programmatic discussions surrounding a potential shift of the Rapid Refresh Forecast System (RRFS) toward using the Model for Prediction Across Scales (MPAS) dynamical core (Carley et al. 2024), a Tiger Team was formed to scope out the technical work needed to add a second dynamical core (dycore) to the Unified Forecast System (UFS). The Tiger Team took a two-pronged approach: scoping the inclusion of a generic new dycore in the UFS while devoting the majority of its work to the MPAS dycore. Similarly, because the drive for a new dycore comes from the RRFS team, the Tiger Team kept in mind the use of the MPAS dycore in all UFS Apps while focusing primarily on the UFS Short-Range Weather (SRW) App.

The Tiger Team collected input from UFS App leadership teams and from NSF NCAR. The connection with both the NSF NCAR Mesoscale and Microscale Meteorology (MMM) and Climate and Global Dynamics (CGD) Laboratories is relevant because MMM develops and uses the MPAS model, while CGD uses the MPAS atmospheric dycore in the Community Atmosphere Model, the atmospheric component of the Community Earth System Model (CAM/CESM). The solution proposed by the Tiger Team is similar to the one used by CGD. The vision is to use the MPAS dycore in the UFS without adopting the entire MPAS code available on GitHub, which includes additional components such as a driver, a framework, and utilities. This arrangement will allow the UFS to retain core parts of its infrastructure, such as its workflow, connection with physics via the Common Community Physics Package (CCPP), Input/Output (I/O) strategy, post-processing capability, and product generation. While this approach is more costly initially, it will save resources in the long run, facilitate community engagement, and smooth the path for bringing innovations into NOAA operations.
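
As a purely conceptual sketch (hypothetical Python, not the Fortran code of the UFS, FV3, or MPAS), the example below illustrates the kind of abstraction this vision implies: the atmospheric component programs against a generic dycore interface, so the shared infrastructure, including CCPP physics and I/O, does not need to know which dycore is running. All class and method names are invented for illustration.

    # Hypothetical, simplified sketch of a dycore-agnostic atmospheric component.
    # Names (DynamicalCore, FV3Core, MPASCore, AtmosphereComponent) are illustrative
    # only and do not correspond to actual UFS, FV3, or MPAS source code.
    from abc import ABC, abstractmethod

    class DynamicalCore(ABC):
        """Generic interface the atmospheric component programs against."""

        @abstractmethod
        def initialize(self, config):
            ...

        @abstractmethod
        def advance(self, dt):
            ...

        @abstractmethod
        def export_state(self):
            """Return prognostic fields for physics, coupling, and I/O."""

    class FV3Core(DynamicalCore):
        def initialize(self, config):
            print("FV3: cubed-sphere grid initialized")
        def advance(self, dt):
            print(f"FV3: dynamics advanced by {dt} s")
        def export_state(self):
            return {"grid": "cubed-sphere"}

    class MPASCore(DynamicalCore):
        def initialize(self, config):
            print("MPAS: unstructured mesh initialized")
        def advance(self, dt):
            print(f"MPAS: dynamics advanced by {dt} s")
        def export_state(self):
            return {"grid": "unstructured mesh"}

    class AtmosphereComponent:
        """Shared infrastructure (workflow, CCPP physics, I/O) stays dycore-agnostic."""
        def __init__(self, dycore, config):
            self.dycore = dycore
            self.dycore.initialize(config)

        def step(self, dt):
            self.dycore.advance(dt)
            state = self.dycore.export_state()
            # CCPP physics, I/O, and post-processing would operate on `state` here,
            # unchanged regardless of which dycore produced it.
            return state

    if __name__ == "__main__":
        atm = AtmosphereComponent(MPASCore(), config={"dt": 60.0})
        atm.step(60.0)

The real work item, of course, is performing the analogous generalization in the UFS atmospheric component and its build system, as described in the list below.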

The technical work needed to include the MPAS dycore in the UFS was estimated to be on the order of 13 full-time equivalent (FTE) employees and can be grouped into the main areas below. Additional resources may be needed to support NSF NCAR in this collaboration and to conduct scientific testing. It should be noted that this level of effort corresponds to the initial integration of the MPAS dycore into the UFS and does not represent the ongoing overhead of maintaining a dual-dycore forecast system.

  • Generalization of the UFS atmospheric component. Portions of the code are tied directly to the Finite-Volume Cubed-Sphere (FV3) dynamics, and these portions need to be generalized to support multiple dycores. The build system also needs to be modified to accommodate this generalization.
  • Code management and testing. A code management plan needs to be devised jointly with NSF NCAR to manage the insertion and potential updates to the MPAS dycore. New regression tests need to be added to the UFS Weather Model to cover the new dycore.
  • Pre-processing. Already-existing MPAS utilities for preparing initial-condition and static files will need to be integrated into the UFS. Tools to obtain or create new MPAS meshes will need to be made available to the community.
  • Data assimilation. Significant work is needed to connect the Joint Effort for Data assimilation Integration (JEDI) with the UFS Weather Model and with RRFS in particular. That said, given that the JEDI interfaces are model agnostic and that the JEDI-MPAS capability already exists, there is no new cost generated by the dynamical core switch. It is assumed that no efforts will be made to integrate the MPAS dycore with the legacy Gridpoint Statistical Interpolation (GSI) data assimilation system.
  • Physics-dynamics coupling. While the CCPP offers model-agnostic interfaces, the substantial differences in physics-dynamics coupling between FV3 and MPAS will demand some adjustments. These pertain to where in the dycore the physics tendencies are applied, differences between time-split and process-split approaches (a toy illustration follows this list), conversions to the MPAS height-based vertical coordinate, and the development of MPAS-specific interstitial schemes. Additional effort will be needed to adapt existing stochastic processes.
  • Inter-component coupling. The MPAS National Unified Operational Prediction Capability (NUOPC) cap that exists in CAM/CESM will be leveraged to expose the MPAS dycore geometry and domain decomposition in the cap of the UFS atmospheric component. Aspects of data memory allocation, the run sequence, and the import/export of fields will need to be addressed.
  • Input/Output and post-processing. Since MPAS outputs data on its native unstructured mesh, additional tools will be needed to convert the output to the desired lat-lon grids. Initially, stand-alone conversion tools can be used (a simple regridding sketch follows this list). Ultimately, to improve performance for operations, the Unified Post Processor (UPP) and the UFS asynchronous I/O component will need to be generalized to write out the desired products.
  • Workflow. The workflow(s) will have to be modified to include the MPAS-specific tasks.
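
As a toy illustration of the time-split versus process-split distinction raised under physics-dynamics coupling, the sketch below advances a single scalar with one dynamics tendency and one physics tendency under each approach. Everything here (the scalar state, the tendency functions, the coefficients) is hypothetical and intended only to show why the order in which tendencies are applied matters.

    # Toy illustration of process-split versus time-split coupling of a physics
    # tendency to a dynamics tendency. The single scalar "state" and the two
    # tendency functions are placeholders, not actual FV3, MPAS, or CCPP code.

    def dyn_tendency(state):
        return -0.001 * state            # placeholder dynamical forcing (per second)

    def phys_tendency(state):
        return 0.0005 * (300.0 - state)  # placeholder physics relaxation (per second)

    def process_split_step(state, dt):
        # Both tendencies are evaluated from the same input state and applied together.
        return state + dt * (dyn_tendency(state) + phys_tendency(state))

    def time_split_step(state, dt):
        # Dynamics updates the state first; physics then acts on the updated state.
        state = state + dt * dyn_tendency(state)
        return state + dt * phys_tendency(state)

    if __name__ == "__main__":
        s0, dt = 290.0, 60.0
        print("process-split:", round(process_split_step(s0, dt), 3))
        print("time-split:   ", round(time_split_step(s0, dt), 3))

The two results differ because the physics sees a different state in each case; reconciling such differences between FV3 and MPAS is part of the coupling work described above.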
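
The conversion from MPAS's native unstructured mesh to a regular lat-lon grid mentioned under Input/Output and post-processing can be pictured with the minimal sketch below. It interpolates a field defined at scattered cell centers onto a 1-degree lat-lon grid using generic SciPy interpolation; the synthetic data, grid, and method are purely illustrative and are not how the MPAS utilities or the UPP perform this conversion operationally.

    # Illustrative stand-alone regridding of values defined at scattered (unstructured)
    # cell centers onto a regular lat-lon grid, as a stand-in for the kind of
    # conversion MPAS native output requires. Hypothetical example, not an MPAS tool.
    import numpy as np
    from scipy.interpolate import griddata

    rng = np.random.default_rng(0)

    # Fake "unstructured mesh" cell centers and a field defined on them.
    lon_cells = rng.uniform(0.0, 360.0, size=5000)
    lat_cells = rng.uniform(-90.0, 90.0, size=5000)
    field_cells = np.sin(np.radians(lat_cells)) + 0.1 * np.cos(np.radians(lon_cells))

    # Target regular lat-lon grid (1-degree spacing).
    lon_out, lat_out = np.meshgrid(np.arange(0.0, 360.0, 1.0),
                                   np.arange(-90.0, 90.5, 1.0))

    # Linear interpolation first, then a nearest-neighbor fill to avoid NaNs near gaps.
    field_latlon = griddata((lon_cells, lat_cells), field_cells,
                            (lon_out, lat_out), method="linear")
    missing = np.isnan(field_latlon)
    field_latlon[missing] = griddata((lon_cells, lat_cells), field_cells,
                                     (lon_out[missing], lat_out[missing]),
                                     method="nearest")

    print(field_latlon.shape)   # (181, 360)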

For more information about this effort, readers are referred to Wang et al. (2023).

Contributed by Ligia Bernardet (NOAA GSL and DTC) and Dom Heinzeller (UCAR/JCSDA)