Ruby and Sapphire: new modeling strategy at MPI-M

The development of Earth-system models at the Max Planck Institute for Meteorology (MPI-M) has shown that models with computational grids on a kilometer scale are feasible. These models use grids fine enough to resolve the transient dynamics of atmospheric disturbances, from a cumulus rain shower to a tropical storm. For this reason, they are referred to as storm-resolving or convection-resolving Earth-system models (SR-ESMs). With such fine grids, the SR-ESMs also explicitly represent other important climate processes, such as mesoscale eddies in the ocean, mesoscale land-surface heterogeneity, the influence of topography on the large-scale atmospheric circulation, and water-mass formation in the ocean. The SR-ESMs are thus structurally and qualitatively different from the highest-resolution conventional climate models and open new scientific horizons. MPI-M aims to take a leading role in the development of this new type of Earth-system model; this ambition has led to a fundamental restructuring of its approach to model development.

The new model-development strategy focuses on developing the next generation of Earth-system models, the SR-ESMs. At the same time, the capabilities of traditional Earth-system models will be better exploited to study climate variability and the effects of slow processes on the climate system. The ICON model system, which takes its name from the spherical grids derived from the icosahedron (ICO) and the non-hydrostatic (N) dynamics used for the atmosphere, is the primary model system at MPI-M. It includes components for the atmosphere, the ocean, and the land. ICON is developed in partnership with the German Meteorological Service (DWD), the German Climate Computing Center (DKRZ), and the Karlsruhe Institute of Technology (KIT).

To handle the development challenges of the SR-ESMs, MPI-M has divided its model development into two threads: Ruby and Sapphire. Sapphire focuses on the development and application of the next-generation Earth-system models, the SR-ESMs; Ruby emphasizes the application and efficient utilization of ICON-based conventional models at MPI-M. Sapphire is closely coordinated with the technical efforts to adapt the ICON model to the requirements of the most powerful computer systems for scientific computing and to the new workflows these entail.

Alternating bi-weekly meetings are held at MPI-M to organize and manage the Sapphire and Ruby threads. For Sapphire's experiment-driven approach, project management focuses on the successful execution and exploitation of the experiments. Attention is devoted to defining experiments, monitoring progress, identifying the steps required to solve problems as they arise, revising the experiment definition, and finally nurturing the scientific exploitation of concluded experiments. The process makes the assignment of technical and computational resources transparent, and it provides a forum in which any scientist can propose a new and innovative experiment. To avoid fragmentation, each experiment is concluded by a retrospective summarizing the knowledge gained and the technical developments that could support a more agile modeling system. In response to these retrospectives, new experiments may be defined, ideally in a way that also addresses scientific goals. The experiment-driven model development helps Sapphire maintain its scientific focus and has resulted in a vibrant collaboration in model development. Ruby also borrows elements of the experiment-driven strategy, but some aspects, such as optimizing the performance of a system for general-purpose use, do not lend themselves as readily to this approach. Nonetheless, Ruby's open and regular development meetings likewise track progress and help with the effective and transparent management of human and computational resources.
 

Ruby
Ruby develops Earth-system models to study climate variability, climate predictability, and climate change, with a code base consistent with that of the high-resolution aspirations pursued in Sapphire. Ruby bundles the MPI-M-wide activities that apply Earth-system models in climate mode. Furthermore, Ruby supports the preparation of specific experimental set-ups for particular scientific purposes. To be suitable for the required long simulations, the Ruby family of model set-ups requires intensive efforts to assure good conservation properties (energy, water, carbon) and careful model tuning. The range of model set-ups extends from very low resolution, for training purposes and multi-millennial paleo studies, to high-end climate models with an eddy-resolving ocean and an atmospheric resolution previously applied only in regional models, thereby closing the gap to the even higher-resolution Sapphire set-ups.
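To make the conservation requirement concrete, the following Python sketch shows the kind of global budget check that such tuning work relies on; the function name, the input layout, and the tolerance mentioned in the comment are illustrative assumptions, not part of the actual Ruby workflow.

    import numpy as np

    def global_budget_drift(content, net_flux, area, dt):
        """Residual drift of a global budget, in W m^-2.

        content:  time series of a globally integrated quantity,
                  e.g. ocean heat content in J
        net_flux: time series of the globally integrated net boundary
                  flux in W, aligned with the intervals between entries
        area:     Earth's surface area in m^2 (about 5.1e14)
        dt:       spacing of the time series in s
        """
        gained = content[-1] - content[0]          # change over the period
        supplied = np.sum(net_flux[:-1]) * dt      # time-integrated net input
        elapsed = dt * (len(content) - 1)
        # A perfectly conserving model returns ~0; a spurious drift of order
        # 1e-3 W m^-2 (an illustrative threshold) would already distort
        # multi-millennial simulations.
        return (gained - supplied) / (area * elapsed)

Analogous checks apply to the water and carbon budgets, with the appropriate reservoirs and boundary fluxes.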

The basis of Ruby is the Ruby-0 configuration. The DECK experiments performed with this configuration in the Coupled Model Intercomparison Project Phase 6 (CMIP6) show that a completely new ICON-based ESM is ready for further experiment-driven developments targeting specific science questions. Ruby-1 and Ruby-2 are two examples of such developments. The ultimate goal of Ruby is to turn the conventional coupled configurations into an SR-ESM suitable for climate simulations.

Ruby Configurations
Ruby-0 offers an efficient configuration for model development, for past, present and future climate simulations, and for large ensembles. While model development has so far mostly happened at the level of individual components (Crueger et al., 2018), work under the new model strategy focuses on the coupled system, whose tuning poses additional challenges (Mauritsen and Roeckner, 2020). After extensive tuning work, Ruby-0 generally performs well compared to observations and to the older MPI-ESM (Jungclaus et al., 2020) (Fig. 1). Efforts are underway to ameliorate the less satisfactory features. For example, the Northern Hemisphere sea-ice distribution is expected to improve with the newly developed dynamical sea-ice model, in which the dynamics are formulated on ICON's native triangular C-grid (Mehlmann and Korn, 2020).

Fig. 1: Standardized climatological errors of two ICON-ESM Ruby-0 simulations of the pre-industrial control state (blue/light blue) and of two CMIP5-MPI-ESM1 simulations (red/brown), each relative to the errors of a CMIP6-MPI-ESM1.2 simulation of the pre-industrial climate. A value smaller than 1 means a smaller error than in MPI-ESM1.2. The errors were averaged over a set of standard variables for the regions Global (GL), Northern Extra-Tropics (NE), Southern Extra-Tropics (SE), and Tropics (TR).
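To illustrate the metric behind Fig. 1, here is a minimal Python sketch, assuming the standardized error is the ratio of a model's area-weighted RMSE to that of the MPI-ESM1.2 reference; the exact definition and all names are assumptions for illustration, since the text does not spell out the formula.

    import numpy as np

    def rmse(field, obs, area):
        """Area-weighted root-mean-square error of a climatology vs. observations."""
        return np.sqrt(np.average((field - obs) ** 2, weights=area))

    def standardized_error(model, reference, obs, area):
        """Error of `model` divided by the error of `reference` (here MPI-ESM1.2).
        Values below 1 mean the model has a smaller error than the reference."""
        return rmse(model, obs, area) / rmse(reference, obs, area)

    # Synthetic example on a 1-degree lat-lon grid:
    lat = np.linspace(-89.5, 89.5, 180)
    area = np.tile(np.cos(np.deg2rad(lat))[:, None], (1, 360))   # area weights
    obs = np.random.default_rng(0).standard_normal((180, 360))
    model = obs + 0.1      # hypothetical model with a small bias
    reference = obs + 0.2  # hypothetical reference with a larger bias
    print(standardized_error(model, reference, obs, area))       # 0.5 < 1

Averaging such ratios over a set of standard variables and over the grid points of each region (GL, NE, SE, TR) would yield numbers comparable to those shown in the figure.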

 

Ruby-1 aims at simulations for seasonal to decadal predictions. Building on the achievements of the MiKlip project on improving forecasts (Pohlmann et al., 2019), a higher-resolution version of ICON-ESM is being developed that balances the need for enhanced resolution against the computational speed required to obtain large numbers of backward-looking forecasts (hindcasts) and predictions. Initial experiments show promising results but require additional fine-tuning. The vertical resolution of the Ruby-1 atmosphere has been increased to 127 levels to ensure a proper representation of atmospheric features such as the Quasi-Biennial Oscillation (QBO). The initialization of the prediction system with ocean observations is presently being implemented by DWD.

Ruby-2 aims at quantifying the role of air-sea interaction processes in climate and in modes of climate variability. Its hallmark is a new coupled configuration consisting of a thin surface layer between a high-resolution ICON-A (atmosphere) and a high-resolution ICON-O (ocean), using the newly developed z* vertical coordinate in ICON-O. Climate variability has previously been studied with high-resolution coupled models, but so far without an ocean surface layer thin enough to resolve the high-frequency air-sea interactions that influence the formation of fronts. In such a configuration, the ocean surface can warm up quickly, which, together with the high resolution of ICON-A and ICON-O and the high coupling frequency, allows air-sea processes to be simulated with unprecedented fidelity. In contrast to the Sapphire experiments, which are restricted to short simulations, the configuration developed in Ruby-2 is used for century-scale simulations. Ruby-2 investigates how details of these well-resolved air-sea processes affect long-term modes of climate variability.
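For orientation, the widely used form of the z* coordinate rescales geometric height z between the free surface \eta(x, y, t) and the resting depth -H(x, y); this is a sketch of the standard formulation, as the exact ICON-O variant is not spelled out here:

    z^* = H \, \frac{z - \eta}{H + \eta}, \qquad -H \le z^* \le 0.

Because z^* always spans the fixed range [-H, 0] regardless of the instantaneous sea-surface height, near-surface layers can be kept very thin without being squeezed away as \eta rises and falls, which is what makes the thin surface layer of Ruby-2 possible.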

Ruby-SWITCH (Simplified Workflow for Icon TeaCHing) has advanced the ICON system to full Earth-system capability by implementing the complete and interactive carbon cycle. As a result, and thanks to ICON's flexibility regarding arbitrary continental configurations, a set-up could be designed for a PhD project that investigates the greenhouse-gas-driven deglaciation from the snowball-Earth state at the end of the Marinoan glaciation (about 635 million years ago). Ruby-SWITCH's standard configuration for the present-day climate serves for education in the Earth System Model Summer School (EaSyMS) at MPI-M.
 

Sapphire
Sapphire develops an ICON version that explicitly resolves the main types of energy transport: horizontal transport from the equator to the poles, and vertical transport from the surface into the atmosphere or into the ocean. This requires a grid spacing of a few kilometers to capture deep convective eddies in the atmosphere and mesoscale eddies in the ocean; the longer-term goal is hundred-meter-scale modeling that resolves shallow convection and submesoscale ocean eddies. While storm-resolving models are now well established for regional climate simulations and weather forecasts, only a handful of modeling groups worldwide apply them globally. The thirteen experiments conducted so far, listed in the table below, have established MPI-M as a leader in this field. Global ICON runs with a grid spacing of 2.5 km were successfully performed for 40 days, the finest grid spacing among the nine SR-ESMs participating in the DYAMOND intercomparison (Stevens et al., 2019). And for the first time, a global coupled simulation with 5 km horizontal resolution was run over several months of simulation time.
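To give a sense of scale, a back-of-the-envelope estimate (derived here, not a figure from the source) of the number of horizontal grid columns in a quasi-uniform global grid with 2.5 km spacing follows from Earth's surface area:

    N \approx \frac{A_{\mathrm{Earth}}}{(\Delta x)^2} = \frac{5.1 \times 10^{14}\ \mathrm{m}^2}{(2.5 \times 10^{3}\ \mathrm{m})^2} \approx 8 \times 10^{7},

i.e., on the order of eighty million grid columns that must be stepped forward at every time step, which is why such simulations stretch even the largest computing systems.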

Through the experiment-driven strategy, the ICON version of MPI-M, initially designed for Ruby resolutions, was successively adapted to Sapphire's needs. This included the implementation of new microphysics schemes; a 3D turbulence scheme, intended especially for simulations at large-eddy resolution; a telescoping grid; and a new vertical coordinate in the ocean, z*, which allows thin layers. On the technical side, a major achievement was the porting of the ICON code to the graphics processing unit (GPU) architecture. To illustrate the experiment-driven strategy and the results achieved, two experiments, DYAMOND++ and SMT (Submesoscale Telescope), are described in more detail below.

Table: List of experiments with their scientific and technical (in italics) leads. Except for HOLOCENE and DYAMOND, which used the physics from the German Meteorological Service (DWD), all experiments use the new Sapphire physics. It is identical to the Ruby physics, but with the parameterizations of gravity waves, orographic drag, cloud cover and convection turned off, and with a different microphysics scheme.

 

The goal of DYAMOND++ was to conduct a coupled global storm-resolving simulation over a couple of months, something that had never been done before. It built upon the uncoupled DYAMOND experiment, for which the I/O of ICON had to be significantly improved to enable global storm-resolving resolution. DYAMOND++ included significant code development: the implementation of a new microphysics scheme, a new vertical-mixing and shortwave-absorption scheme in the ocean, a revised treatment of river runoff, and a shortening of the initialization phase from 30 to 5 minutes. The final throughput was 30 simulated days per day on 420 computing nodes (out of 1,420 nodes) of DKRZ's supercomputer "Mistral". The initial state of the ocean, which requires many years to spin up, was obtained from an existing 6.5-year-long uncoupled ocean run with 10 km horizontal resolution, driven by ERA5 data.
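A rough cost estimate follows from these numbers (derived here, not stated in the source): at 30 simulated days per day on 420 nodes,

    \frac{420\ \text{nodes} \times 24\ \text{h}}{30\ \text{simulated days}} = 336\ \text{node-hours per simulated day}, \qquad 336 \times 365 \approx 1.2 \times 10^{5}\ \text{node-hours per simulated year},

which illustrates why multi-month simulations were the practical limit for this configuration.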

DYAMOND++ fulfilled its goals. The expected large-scale features of the climate system, as shown in Fig. 2 for sea-surface temperature (SST) and precipitation, are reproduced, and first analyses revealed a more realistic position of the Intertropical Convergence Zone (ITCZ). Remaining issues were noted in the experiment's retrospective: numerical instabilities developing at the top of the atmospheric model domain, solved in the follow-up experiment DYAMOND-winter, and an unrealistically strong warming of the subtropical ocean. The latter is treated as a scientific problem, as it pertains to the understanding of the interactions between radiation, convection, the boundary layer, and the surface.

Fig. 2: Mean precipitation (shading) and SST (contour) from DYAMOND++.

The goal of the SMT experiment was twofold: first, to carry out a global ocean experiment with local refinement down to a resolution of several hundred meters, which is out of reach for uniform-resolution configurations in the foreseeable future, thereby demonstrating the power of ICON-O; second, to contribute to the understanding of submesoscale dynamics and its role in the global circulation. While ICON-O worked more or less 'out of the box' with minimal tuning, the output and data storage required technical improvements. The final performance was eight simulated days per day on 336 nodes of "Mistral" with two-hourly output. SMT is succeeded by the SMT-Wave experiment, which studies the interaction between eddies and waves. SMT-Wave benefits from model improvements such as the implementation of the new z* vertical coordinate, which allows higher vertical resolution near the surface, and the incorporation of tides into ICON.

 

Publications

Crueger, T., et al. (2018) ICON-A: the atmospheric component of the ICON Earth System Model. Part II: Model evaluation. Journal of Advances in Modeling Earth Systems, 10, 1638-1662. doi:10.1029/2017MS001233

Hohenegger, C., L. Kornblueh, D. Klocke, T. Becker, G. Cioni, J. F. Engels, U. Schulzweida and B. Stevens (2020) Climate statistics in global simulations of the atmosphere, from 80 to 2.5 km grid spacing. Journal of the Meteorological Society of Japan, 98, 73-91. doi:10.2151/jmsj.2020-005

Jungclaus, J., et al. (2020) ICON-ESM: structure and performance of a climate model based on unstructured grids. Manuscript in preparation.

Mauritsen, T. and E. Roeckner (2020) Tuning the MPI-ESM1.2 global climate model to improve the match with instrumental record warming by lowering its climate sensitivity. Journal of Advances in Modeling Earth Systems, 12, e2019MS002037. doi:10.1029/2019MS002037

Mehlmann, C. and P. Korn (2020) Sea-ice dynamics on triangular grids. Journal of Computational Physics. In review.

Pohlmann, H., W. A. Müller, M. Bittner, S. Hettrich, K. Modali, K. Pankatz and J. Marotzke (2019) Realistic quasi-biennial oscillation variability in historical and decadal hindcast simulations using CMIP6 forcing. Geophysical Research Letters, 46, 14118-14125. doi:10.1029/2019GL084878

Stevens, B., et al. (2020) The added value of large-eddy and storm-resolving models for simulating clouds and precipitation. Journal of the Meteorological Society of Japan, 98, 395-435. doi:10.2151/jmsj.2020-021

Stevens, B., M. Satoh, L. Auger, et al. (2019) DYAMOND: the dynamics of the atmospheric general circulation modeled on non-hydrostatic domains. Progress in Earth and Planetary Science, 6, 61. doi:10.1186/s40645-019-0304-z

 

More information

Focus text about DYAMOND:
https://mpimet.mpg.de/en/communication/focus-on/dyamond-next-generation-climate-models

MiKlip project:
https://mpimet.mpg.de/en/science/projects/miklip-projekt

 

Contact

Prof. Dr. Bjorn Stevens
Max Planck Institute for Meteorology
Phone: +49 (0)40 41173 422 (Assistant: Angela Gruber)
Email: bjorn.stevens@mpimet.mpg.de

Dr. Johann Jungclaus
Max Planck Institute for Meteorology
Phone: +49 (0)40 41173 109
Email: johann.jungclaus@mpimet.mpg.de

Dr. Christian Reick
Max Planck Institute for Meteorology
Phone: +49 (0)40 41173 117
Email: christian.reick@mpimet.mpg.de

Dr. Cathy Hohenegger
Max Planck Institute for Meteorology
Phone: +49 (0)40 41173 302
Email: cathy.hohenegger@mpimet.mpg.de

Dr. Peter Korn
Max Planck Institute for Meteorology
Phone: +49 (0)40 41173 470
Email: peter.korn@mpimet.mpg.de