OVERALL PERFORMANCE MEASURES

4th QUARTER METRIC COMPLETED: 

Demonstrate Fully Coupled Simulations over the Arctic with Different RRM Configurations and the Impact of High Resolution in the Refined Region

Product Definition

This report summarizes the improvements in the representation of North Atlantic and arctic climate using coupled versions of the Department of Energy’s (DOE) Energy Exascale Earth System Model (E3SM). E3SM allows for the use of regional refinement in all of its components so that high spatial resolution can be focused in specific regions of the globe while maintaining lower resolution in others. This regionally refined mesh (RRM) capability (currently existing in only one other Earth system model; Jungclaus et al. 2022) results in high-resolution model fidelity in regions of interest at a fraction of the computational cost required by a global, high-resolution model. Below, four coupled E3SM configurations are highlighted: 1) the standard-resolution configuration (Standard); 2) the Interdisciplinary Research For Arctic Coastal Environments (InteRFACE) configuration; 3) the North American RRM (NARRM) configuration; and 4) the High-Latitude Applications and Testing (HiLAT) configuration. The latter three all include RRMs in one or more components to improve deficiencies present in the Standard configuration. Each of these is described in more detail in Section 2.

In Section 3, we describe the ability of E3SM’s Standard and RRM configurations to resolve important aspects of North Atlantic and arctic climate, including surface ocean currents, eddy-kinetic energy (EKE), sea surface temperature and salinity (SST and SSS, respectively), ocean mixed-layer depth (MLD), sea ice properties, and ocean stratification and velocities along observational transects.

Relative to standard quasi-uniform-resolution E3SM configurations, we show that RRM simulations demonstrate clear improvements in the North Atlantic and the Arctic. These improvements include: (1) the representation of ocean surface currents and EKE, which are critical for accurately resolving heat and salt transport; (2) reduced SST and SSS biases, which are important for accurate representation of ocean vertical mixing and the sea ice cover; (3) reduced biases in MLDs; (4) a reduction in long-standing biases in arctic sea ice concentration, particularly in the Labrador Sea region, and sea ice thickness distribution in the central Arctic; and (5) volumetric transport between the Arctic and North Atlantic ocean basins that is in much better agreement with observations.

Product Documentation

In this section the E3SM mesh configurations are described. A description of the time domain of each simulation is also given for completeness. The four main configurations that are discussed further in the document are:

  1. The standard configuration (hereafter Standard) − In the standard configuration, the ocean resolution is 30 km at the Equator and decreases smoothly to 60 km in mid-latitude regions and then increases again to 30 km in high-latitude regions. The design of the base mesh was chosen to fully resolve equatorial wave dynamics and, outside of the tropics, the resolution is such that mesoscale eddies are fully parameterized. The standard atmosphere resolution is uniform at ~100 km.
  2. The InteRFACE configuration (hereafter InteRFACE) − This configuration uses the standard atmosphere resolution, but couples it to an RRM for ocean and sea ice in which coastal North America, the Arctic, and the North Atlantic are resolved at 14 km. Away from this region, the spatial resolution smoothly transitions back to the ocean and sea ice resolution of Standard (Figure 1; an illustrative sketch of this resolution blending follows this list). This mesh was designed to address science questions (i) under E3SM, pertaining to the changing water cycle in North America, and (ii) under InteRFACE, pertaining to how changes in sea ice distributions will impact arctic shipping and economies of the region.
  3. The NARRM configuration (hereafter NARRM) − This configuration uses the same ocean and sea ice RRM as InteRFACE and is coupled to an atmospheric RRM that has been refined to 25 km over North America transitioning to the standard 100-km resolution over the rest of the globe.
  4. The HiLAT configuration (hereafter HiLAT) − This configuration uses an ocean and sea ice RRM with refinement down to 10 km in the Arctic, which smoothly transitions away from this region to the standard resolution. This is paired with an atmosphere RRM with arctic regional refinement down to 25 km, which also transitions to the standard resolution outside this region. Of the four configurations discussed herein, the HiLAT and NARRM configurations are noteworthy in that they include refinement in the ocean, sea ice, and atmosphere components, currently a new and rare capability among Earth system models (ESMs).
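
To make the RRM design concept concrete, the sketch below blends a refined target cell width (here the 14-km InteRFACE value quoted above) smoothly into a coarser background spacing. The region boundaries, transition width, and cosine blending are illustrative assumptions only; this is not the actual E3SM mesh-generation workflow.

```python
import numpy as np

def blend_cell_width(lat, lon,
                     background_km=60.0,    # illustrative background spacing
                     refined_km=14.0,       # InteRFACE-style refinement target
                     transition_deg=10.0):  # assumed transition width in degrees
    """Schematic blend of a refined cell width into a coarser background.

    Inside an (assumed) Arctic/North Atlantic box the target spacing is
    14 km, relaxing smoothly back to the background value over a
    transition zone of a few degrees.
    """
    # Distance (in degrees) outside the nominal refined region; 0 inside it.
    d_lat = np.maximum(50.0 - lat, 0.0)                   # refined poleward of ~50N (assumption)
    d_lon = np.maximum(np.abs(lon + 40.0) - 60.0, 0.0)    # roughly the Atlantic sector (assumption)
    dist = np.hypot(d_lat, d_lon)

    # Smooth 0-to-1 weight: 1 inside the refined region, 0 far away.
    w = 0.5 * (1.0 + np.cos(np.pi * np.clip(dist / transition_deg, 0.0, 1.0)))
    return refined_km + (background_km - refined_km) * (1.0 - w)

# Example: cell width in the subpolar North Atlantic vs. the tropics -> [14. 60.]
print(blend_cell_width(np.array([60.0, 0.0]), np.array([-30.0, -30.0])))
```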

The RRM meshes were designed to answer science questions of the E3SM, InteRFACE, and HiLAT projects. The long-term goal of E3SM’s Water Cycle campaign is to understand and project changes in water availability over the United States. The InteRFACE project focuses on how changes in sea ice distributions will impact arctic economies and coastal communities through changes in shipping and resource extraction. The HiLAT project focuses on changes in the arctic Earth system, in particular on processes and feedbacks contributing to the accelerated warming of the Arctic known as arctic amplification. The RRM capability has allowed us to design specific meshes to ensure that critical gateways for arctic-subarctic volume, freshwater, and heat exchanges are realistically represented and that potentially critical shipping routes across the Arctic (i.e., the Northwest Passage and the Northern Sea Route) are explicitly resolved, at a fraction of the cost of global high resolution. As such, the RRM capability provides a key advantage compared to models at standard resolution, which are commonly limited in their representation of these arctic gateways and passages, leading to biases in modeled oceanographic and sea ice conditions.

To adequately address critical science questions, the RRMs discussed above must robustly represent not only critical processes within the Arctic but also the lower-latitude processes that affect the Arctic. As an example, heat transport into the Arctic, especially from the Atlantic, exerts a strong control on sea ice thickness and extent. Thus, the Gulf Stream and North Atlantic Current must be well represented to accurately capture this oceanic heat transport. An understanding of these key physical processes guided our design of the three refined meshes discussed above, which extend the regions of high spatial resolution to critical regions at lower latitudes.

The InteRFACE, NARRM, and HiLAT RRMs were designed to have near-eddy-resolving to eddy-permitting resolution in the Tropics, North Atlantic, and the Arctic. This allows for an improved representation of the Gulf Stream and North Atlantic currents (e.g., Chassignet and Marshall 2008). The improved currents are clearly seen in a snapshot of the surface kinetic energy (Figure 2), where the left panel shows Standard and the right panel shows InteRFACE. For Standard, there are only one or two grid cells across the entire width of the Gulf Stream, resulting in an overly weak current that does not separate from the coast at all, which is inconsistent with observations. By contrast, for InteRFACE, eddy structures in the Gulf Stream region and the subpolar North Atlantic are clearly visible. In particular, we note that the separation of the Gulf Stream at Cape Hatteras is consistent with observations and greatly improved over the low-resolution configuration. Also, the Irminger and Labrador currents are much stronger in InteRFACE; in addition to contributing to the improvements discussed below, these stronger currents are important for the coupling between the Greenland ice sheet and the Atlantic and Arctic Oceans. The InteRFACE mesh increases the number of cells by less than a factor of two while greatly improving the simulation of these currents.

For all configurations, mesoscale eddies are not fully resolved everywhere, and they are represented through the Gent-McWilliams (GM) parameterization (Gent and McWilliams 1990). This parameterization is active in lower-resolution regions where the Rossby radius is not resolved, and inactive otherwise. For Standard, GM is fully active for the entire globe. For the arctic-refined configurations, GM is off for resolutions finer than 20 km. These sub-20-km regions comprise the Arctic and subpolar North Atlantic in all of the refined configurations, and additionally the coastal ocean surrounding North America in the InteRFACE and NARRM configurations. The strength of the GM parameterization is linearly tapered to zero as a function of resolution, allowing for the improved representation of heat and mass transport in regions where the increased mesh resolution allows eddies to be resolved.

The NARRM and Standard configurations have completed 500-year pre-industrial climate control simulations and a 165-year historical period (1850-2015) simulation. The InteRFACE configuration has completed a 200-year pre-industrial control simulation. The HiLAT simulations have so far completed only multi-decadal integrations under fixed 1950 conditions. All simulations were run fully coupled (active atmosphere, ocean, land, and sea ice). Given the unequal simulation lengths, we focus on the shared time period (200 years) for any comparisons involving the NARRM, Standard, and InteRFACE configurations. For time series, the full 200 years are analyzed and climatologies are computed over the final 50 years. To compare to observational transects, we also present a climatology over the 1980-2014 period of the E3SM Standard and NARRM historical period simulations.
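
As an illustration of the analysis procedure described above, the sketch below computes climatologies over the final 50 years of a 200-year monthly time series using xarray. The file pattern and variable name are hypothetical placeholders, not the actual E3SM output conventions.

```python
import xarray as xr

# Hypothetical monthly-mean ocean output; file pattern and variable name are placeholders.
ds = xr.open_mfdataset("ocean_monthly_*.nc", use_cftime=True)

# Final 50 years of a 200-year simulation (600 monthly records).
last50 = ds.isel(time=slice(-50 * 12, None))

# Monthly climatology and time-mean climatology of, e.g., SST.
sst_clim_monthly = last50["sst"].groupby("time.month").mean("time")
sst_clim_mean = last50["sst"].mean("time")
```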

 

3rd QUARTER METRIC COMPLETED: 

Demonstrate that the Antarctic and Southern Ocean Regionally Refined Mesh (RRM) Improves Fully Coupled E3SM Simulations of Antarctic Sub-Ice-Shelf Melt Fluxes and Southern Ocean Climate

Product Definition

This report summarizes progress towards improving the representation of the Southern Ocean climate and its coupled interactions with the Antarctic ice sheet using the U.S. Department of Energy’s (DOE) Energy Exascale Earth System Model (E3SM). In particular, we focus on its Regionally Refined Mesh (RRM) capability, which allows for the placement of high spatial resolution in specific regions of the globe and lower resolution in others. The result is high-resolution model fidelity in regions of interest at a fraction of the computational cost required by a global, high-resolution model configuration. In the Southern Ocean RRM (SORRM) configuration discussed below, the Southern Ocean and sea ice around Antarctica are resolved at 12-km resolution. Moving farther away from the Antarctic coast and out of the Southern Ocean, spatial resolution in the ocean and sea ice models smoothly transitions back to that of E3SM’s standard-resolution configuration of between 30 and 60 km (Figure 1). The transition in resolution across the Southern Ocean, from non-eddying to eddy-permitting around Antarctica, is critical for improvements in the representation of oceanographic processes and ice sheet and ocean interactions that govern the exchange of heat and freshwater between the ocean and Antarctica’s overlying ice shelves. These interactions are critical for accurate simulation of the “health” of Antarctica’s ice shelves under a changing climate. Ice shelves are important because they provide resistance to the flux of ice off of the Antarctic continent and into the ocean (often referred to as “buttressing”); ice-shelf degradation, e.g., through increased submarine melting, results in increased sea level rise from Antarctica via the dynamic connection between floating ice shelves and the grounded ice upstream (see e.g., Gudmundsson 2013, Sun et al. 2021).

In the following sections, we further describe the details of E3SM v2 standard and RRM configurations that are designed to simulate, explore, and improve our understanding of ice shelf and ocean interactions in the Southern Ocean and around Antarctica. These novel configurations are designed to allow for ocean circulation under ice shelves within fully coupled, global simulations (described in greater detail by Comeau et al. [2022] for E3SM v1). We first provide detailed visualizations of the differences in computational meshes employed for our standard-resolution and SORRM configurations. We then provide a comparison of several important simulation outputs at the scale of the entire Southern Ocean or Antarctic continent and relative to observations in order to demonstrate that outputs from our SORRM simulations are superior to those from standard-resolution configurations. Finally, we provide a more detailed, regional-scale comparison for several key coastal regions around Antarctica, again demonstrating that outputs from the SORRM configuration are superior to those from the standard resolution configuration.

Relative to standard-resolution E3SM configurations, we show that simulations conducted with our SORRM configuration demonstrate clear improvements in Southern Ocean processes important for controlling sub-ice-shelf melt rates, both at the whole-continent and regional scale. These improvements include: (1) large-scale volume transports, a prerequisite for accurate simulation of ocean circulation at regional scales (consistent with observations for SORRM configurations but too low and outside the range of observations for standard configurations); (2) ocean bottom temperature and salinity, important because these waters ultimately come into contact with the most sensitive parts of the ice shelves (over six important continental shelf areas, mean bias reductions using SORRM are 34% for temperature and 48% for salinity relative to using standard configurations); (3) broad, overall improvements in the representation of temperature, salinity, and density for Southern Ocean water masses critical for the accurate modeling of sub-ice-shelf circulation (high-salinity shelf water, HSSW, which is also a critical precursor to Antarctic bottom water, AABW, a driver of global ocean circulation). These improvements combine to provide time series of Antarctic sub-ice-shelf melt rates that are stable and closer to observation-based estimates in SORRM configurations than in standard, lower-resolution configurations. This conclusion applies at the whole-Antarctic scale, at the regional scale, and at the scale of individual ice shelves.

Product Documentation

The E3SM ocean and sea ice meshes presented here include the 60-to-30 km “standard-resolution” mesh and the Southern Ocean Regionally Refined Mesh or “SORRM” (Figure 1). The standard resolution is 30 km at the Equator and decreases smoothly to 60 km in mid-latitude regions and then increases again to 35 km in high-latitude regions. This design was chosen because equatorial and high-latitude dynamics require higher resolution, while high resolution is less important at mid-latitudes. The SORRM keeps this distribution in the Pacific, the world’s largest ocean, but adds refined cells of 12 km in the Southern Ocean and on the Antarctic continental shelf, along coastlines, and below the ice shelves (Figure 2). Southern Ocean dynamics are influenced by the Atlantic meridional overturning circulation, so mid-range cells of 40 km extend throughout the Atlantic to the Arctic Ocean.
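
The cell-width distribution described above can be summarized as a simple function of latitude. The sketch below encodes the quoted values (30 km at the Equator, 60 km at mid-latitudes, 35 km at high latitudes, and 12 km in the Southern Ocean); the transition latitudes and the blending shape are illustrative assumptions, not the actual SORRM design script.

```python
import numpy as np

def sorrm_like_cell_width(lat_deg):
    """Schematic cell-width profile (km) as a function of latitude.

    The base values (30 km at the Equator, 60 km at mid-latitudes,
    35 km at high latitudes) and the 12-km Southern Ocean refinement
    follow the text; the transition latitudes and the cosine-shaped
    blending are illustrative assumptions only.
    """
    lat = np.asarray(lat_deg, dtype=float)
    alat = np.abs(lat)

    # Base profile: 30 km (Equator) -> 60 km (mid-latitudes) -> 35 km (poles).
    base = np.where(alat < 30.0,
                    30.0 + 30.0 * 0.5 * (1.0 - np.cos(np.pi * alat / 30.0)),
                    np.where(alat < 60.0,
                             60.0,
                             60.0 - 25.0 * 0.5 * (1.0 - np.cos(np.pi * (alat - 60.0) / 30.0))))

    # Southern Ocean refinement to 12 km, with an assumed 50S-60S transition band.
    w = np.clip((-lat - 50.0) / 10.0, 0.0, 1.0)   # 0 north of 50S, 1 south of 60S
    return (1.0 - w) * base + w * 12.0

# Equator, Southern Hemisphere mid-latitude, Antarctic shelf, northern high latitude
print(sorrm_like_cell_width([0.0, -45.0, -70.0, 75.0]))  # -> [30.  60.  12.  47.5]
```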

In Figure 3, we show the SORRM configuration in additional detail, highlighting the mesh in particular regions of interest. Each regional zoom includes both the standard and SORRM configurations in order to provide a qualitative comparison of the additional detail afforded by the variable-resolution SORRM configuration. At low resolution, the small ice shelves, such as those of Pine Island Glacier and Thwaites Glacier, contain only two or three cells each, making sub-ice-shelf ocean dynamics and the resultant melt rates unrealistic. The SORRM mesh, with 12-km cells, increases the number of cells by a factor of about 10 and improves the simulation of these processes.

For both our standard and SORRM configurations, mesoscale eddies are not fully resolved everywhere and are represented through the Gent-McWilliams (“GM”) parameterization (Gent and McWilliams 1990). This parameterization is active in lower-resolution regions where the Rossby radius is not resolved, and inactive otherwise. For the standard-resolution configuration, GM is fully active for the entire globe. In the SORRM configuration, GM is off in the Southern Ocean because the Rossby radius is resolved in the refined region. The strength of the GM parameterization is linearly “ramped” down as a function of resolution, allowing for the improved representation of heat and mass transport due to ocean eddies.

 

2nd QUARTER METRIC COMPLETED: 

Demonstrate Improved Representation of Ocean and Sea Ice Processes Including Arctic-Subarctic Oceanic Exchanges and Sea Ice Thickness Distribution Using Ocean and Sea Ice Regionally Refined Mesh (RRM) Capabilities

Product Definition

This report summarizes the progress that has been made to improve the representation of ocean and sea ice processes, including Arctic-Subarctic oceanic exchanges and sea ice thickness distribution, using the Department of Energy’s (DOE) Energy Exascale Earth System Model (E3SM) Regionally Refined Mesh (RRM) capabilities for the ocean and sea ice (OSI) model components. To that end, we present the results of simulations with an ocean/sea ice configuration of E3SM with regional grid refinement in the Arctic. This configuration, referred to as E3SM-Arctic-OSI (or E3SM-Arctic for short), has been developed by the HiLAT-RASM project in collaboration with E3SM team members. The grids of the ocean and sea ice components of E3SM-Arctic have a high spatial resolution (10 km) in the Arctic and subpolar North Atlantic Oceans and standard resolution (up to 60 km) elsewhere. The impacts of the Arctic Ocean mesoscale are likely not fully accounted for in coarse-resolution (i.e., >25-km) models, since the Rossby radius of deformation, which must be resolved to capture the ocean mesoscale (e.g., eddies, coastal currents and jets, exchanges through narrow gateways), is on the order of 10 km or less (Nurser and Bacon 2014). This motivated the development of E3SM-Arctic to enable a more realistic representation of Arctic-Subarctic oceanic exchanges and sea ice thickness.

In the next section, we describe the details of this configuration, followed by results that compare the performance of E3SM-Arctic in several key metrics against the standard resolution configuration of E3SM, referred to as E3SM-LR-OSI (E3SM-LR for short; Petersen et al. 2019). The results presented here are primarily based on version 1 of E3SM (Golaz et al. 2019) and have been documented in Veneziani et al. (in press). We will also present some preliminary results from E3SMv2. Our results demonstrate that the Arctic-refined configuration of E3SM provides significant improvements in the representation of features that are important for the Arctic Ocean and its interactions with lower latitudes. These improvements include the strength of the Atlantic Meridional Overturning Circulation (AMOC), ocean heat transport into the Arctic, the stratification of the Arctic, and sea ice deformation. No improvement is found in sea ice thickness distributions, which are controlled primarily by atmospheric forcing. We anticipate seeing more significant improvements in the fully-coupled simulations, where ocean-atmosphere-sea ice interactions can amplify the differences imposed by the more detailed representation of the ice pack (for instance, leads).

Product Documentation

The ocean and sea ice models of E3SM-Arctic use grids that have a high spatial resolution (10 km) in the Arctic and subpolar North Atlantic Oceans and standard resolution (up to 60 km) elsewhere. Figure 1 shows the resolution as a function of latitude used for E3SM-Arctic (red curves) and E3SM-LR (black). Both configurations have a comparable resolution in the Southern Hemisphere, with grid sizes ranging from 60 km at southern mid-latitudes to 30 km approaching the South Pole and the Equator. While E3SM-LR repeats this resolution structure in the Northern Hemisphere, the resolution in E3SM-Arctic transitions from 30 km at the Equator to 10 km at the North Pole. In order to ensure that the exchanges between the North Atlantic and Arctic are accurately represented, the transition towards high resolution starts at a lower latitude (10°N) in the Atlantic than in the Pacific (30°N).

Figure 2 shows how E3SM-Arctic resolves the main narrow gateways in and out of the Arctic Ocean, which are not realistically represented in E3SM-LR. These passages have been shown to play an important role in the exchange of heat and freshwater between the subpolar North Atlantic/Pacific and the Arctic Ocean (e.g., Smedsrud et al. 2013; Koenigk and Brodeau 2014; Maslowski et al. 2014; Woodgate and Peralta-Ferriz 2021; Zhang et al. 2021). In addition, many of these passages are relatively shallow (e.g., the Bering Strait or the Barents Sea Opening), and hence the vertical resolution in E3SM-Arctic has been increased to 80 levels, compared to 60 levels in E3SM-LR.

Mesoscale eddy transport in regions outside of the Arctic and pan-Arctic region is parameterized using the Gent-McWilliams (GM) parameterization (Gent and McWilliams 1990). We achieve a regionally varying GM by making the GM coefficient dependent on the grid cell size. For the E3SM-Arctic configuration considered here, the GM kappa parameter varies linearly between zero for cell sizes below 20 km and a maximum value of 600 m²/s for cell sizes above 30 km. This implies that we effectively transition from GM-on to GM-off between 10°N and 28°N in the North Atlantic, and between 25°N and 50°N in the North Pacific (indicated by the white and red lines in Figure 1). GM is turned on everywhere in E3SM-LR.
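
The resolution-dependent GM coefficient described above amounts to a simple linear ramp. The sketch below implements the quoted values (zero below 20 km, 600 m²/s above 30 km); the function name and form are a schematic of the tapering rule, not the E3SM source code.

```python
import numpy as np

def gm_kappa(cell_size_km, kappa_max=600.0, off_below_km=20.0, on_above_km=30.0):
    """Resolution-dependent GM coefficient (m^2/s).

    Zero for cell sizes below 20 km (eddies resolved, GM off), the
    maximum value of 600 m^2/s for cell sizes above 30 km (GM fully on),
    and a linear transition in between, as described in the text.
    """
    frac = np.clip((np.asarray(cell_size_km, dtype=float) - off_below_km)
                   / (on_above_km - off_below_km), 0.0, 1.0)
    return kappa_max * frac

print(gm_kappa([10.0, 25.0, 60.0]))  # -> [  0. 300. 600.]
```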

All versions of E3SM used in this study, including E3SM-Arctic, are forced by the Japanese atmospheric reanalysis, spanning the period from 1958 to 2016 (JRA55; Tsujino et al. 2018). We repeated this cycle three times to allow the ocean sufficient time to equilibrate.

 

1st QUARTER METRIC COMPLETED: 

Demonstrate Computational Feasibility of the Regionally Refined Mesh (RRM) in the Antarctic and the Arctic

Product Definition

This report summarizes the progress that has been made to configure and generate regionally refined climate simulations in Version 2 of the Energy Exascale Earth System Model (E3SMv2) in a computationally feasible way on Department of Energy supercomputers. E3SM is designed to resolve targeted regions of the globe at high horizontal resolution within a standard-resolution global mesh to improve regional climate reconstructions and projections pertinent to particular science questions. For polar regions, this capability is required to understand feedbacks between global climatic change and regional evolution of the Arctic and Antarctic. More specifically, E3SM has been configured to determine how rapid changes in the cryosphere could evolve with the Earth system and contribute to sea level rise and increased coastal vulnerability. Reductions in the size of the Greenland and Antarctic ice sheets and Southern Ocean ice shelves are anticipated or already underway, and their future mass balance poses the greatest uncertainty in projections of 21st Century sea level rise. It is also important to understand climate feedbacks associated with a diminishing sea ice cover, which extends over 7-10% of the global ocean surface and helps insulate the Earth from solar heating, but has decreased in global extent over the past 40 years.

The capability to refine the Arctic and Antarctic in E3SM rests not just on simulating polar climate using tailored meshes for the ocean, sea ice, atmosphere, and land models in the coupled system, but also on efficient workflows to generate the meshes and configure E3SM to use them. Therefore, we break down the capability into two components. First, we describe the efficiency of the workflow to generate regionally refined meshes (RRMs), which has been greatly improved, reducing the time needed to configure RRMs in E3SM from several months to as little as one week. Then, we demonstrate the computational cost of integrating two regionally refined versions of E3SM (one for the Arctic, the other for the Antarctic) relative to both the standard-resolution configuration and a global high-resolution configuration.

Product Documentation

(see noted tables and figures in the Quarterly Metric document)

Generating regionally refined ocean-ice meshes for E3SM has become commonplace owing to great advances in the methods used to create unstructured grids for the Model for Prediction Across Scales (MPAS), within which the ocean (MPAS-O) and sea ice (MPAS-SI) are represented as components of E3SM. In this report, we use two illustrative examples of polar ocean and sea ice mesh refinement (the Arctic-North American and Southern Ocean refinements illustrated in Figure 1) and compare them both to the standard E3SM mesh and to the E3SM globally high-resolution mesh summarized alongside these in Table 1.

The E3SM Version 1 (E3SMv1) mesh generation process for ocean and sea ice components was slow and laborious, requiring several days to generate a standard-resolution mesh and two to three weeks to generate a very high-resolution mesh. Breakthroughs in mesh generation algorithms and the human design interface reduced this generation time by a factor of 50 to 100 in E3SM Version 2 (E3SMv2), down to a few minutes for low resolution and several hours for high-resolution meshes. The mesh‑generation algorithm was upgraded from Lloyd’s algorithm, which is slow to converge, to the JIGSAW library described below, which achieves significant advances in speed by strategically adding and removing cells in the iterative process. Equally important, a convenient and well-documented interface, COMPASS (https://mpas-dev.github.io/compass), allows modelers to design variable-resolution meshes based on distance from coastlines and shapes drawn with an online tool, with geometric parameters that are easy to adjust on the fly. COMPASS tracks how each mesh was created to ensure long-term reproducibility. In combination, these improvements have led to a much faster turnaround in the mesh design, simulation, and feedback process, so that modelers can consult with domain experts to explore a number of configurations and produce the best simulations for the computational cost. This is the process used to generate the Arctic and Southern Ocean refinements in Figure 1, and has ensured, for example, that all major Arctic shipping pathways are open in E3SMv2. A practical demonstration of the mesh approval process for these respective meshes can be found on GitHub for the Arctic and Antarctic.

Leveraging the JIGSAW unstructured meshing library (Engwirda 2017) has enabled the creation of complex, variable-resolution meshes to resolve regional sea-ice (Turner et al. 2021), ocean (Hoch et al. 2020), and land-ice (Hoffman et al. 2018) dynamics. Compared to the initial optimization‑only meshing approaches pursued in E3SMv1 (Jacobsen et al. 2013), the unstructured meshing kernels in the JIGSAW library take a number of alternative pathways to solving the various computational-geometric and algorithmic problems inherent to the generation of the spherical Centroidal Voronoi Tessellations (CVTs) (Ringler et al. 2008) used in the MPAS-O and MPAS-SI dynamical cores.

  • Efficient initialization: The creation of optimal CVT meshes is known to be a difficult optimization problem (Du et al. 1999), requiring a potentially very large number of expensive global iterations to converge toward a high-quality mesh given an arbitrary initial distribution of points and cells. In the JIGSAW library, a multi-paradigm meshing strategy is used — first building an initial mesh based on a so-called off-center Delaunay-refinement scheme (Engwirda and Ivers 2016) to provide a high‑quality initial condition for the subsequent CVT mesh optimization passes. As per Figure 2 (a‑d), this fractal-like approach generates initial meshes that are already largely quasi-optimized, significantly reducing the burden on the final, computationally expensive, mesh optimization procedure.
  • Coupled geometry and topology optimization: The construction of optimal CVT meshes is a coupled optimization problem, requiring both a high-quality arrangement of mesh vertices (geometry) and optimal connectivity between mesh cells (topology). In the JIGSAW library, an expanded set of mesh optimization predicates is introduced (cell collapse, progressive refinement), as well as a nonlinear hill-climbing optimization schedule that focuses on improving the worst-quality cells in a mesh at each CVT iteration (Engwirda 2017, 2018). These methods improve the quality and robustness of the E3SMv2 meshing workflow, ensuring that the resulting polygonal-triangular meshes (see Figure 2e) are well conditioned with respect to MPAS-type numerical methods. A key difficulty associated with the E3SMv1 meshing approach was the generation of invalid staggered grid configurations (see Figure 1f) in which adjacent polygonal and triangular cells were not consistently staggered with respect to one another. These poor-quality grid configurations led to a breakdown in the MPAS discretization and thus limited the use of varying mesh resolution in E3SMv1. This lack of robustness has been remedied in E3SMv2, with JIGSAW’s enhanced mesh optimization strategies leading to valid, well-conditioned staggered meshes in complex, regionally refined cases.
  • Minimal algorithmic complexity: Significant computational efficiency can be gained by exploiting advanced data structures and algorithmic constructs to reduce the expense of operations on large-scale meshes. The JIGSAW library is structured around efficient, local updates to global mesh data structures, leading to a quasi-optimal O(n log n) implementation. An approach based on linear, global data structures was taken in the original E3SMv1 meshing algorithm, requiring a much-expanded O(n³) overall operation count. Noting the large size of meshes for E3SM (e.g., n ≥ 1 × 10⁵), the reduction of algorithmic complexity from O(n³) to O(n log n) represents an orders-of-magnitude improvement in runtime (see the brief estimate following this list).
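
To put the complexity reduction in perspective, the short estimate below evaluates the two scalings for a representative mesh size of n = 1 × 10⁵ cells; the numbers are illustrative operation counts rather than measured runtimes.

```python
import math

n = 1e5                      # representative E3SM mesh size from the text
ops_old = n**3               # O(n^3) scaling of the original E3SMv1 approach
ops_new = n * math.log(n)    # O(n log n) scaling of the JIGSAW-based approach

print(f"n^3     ~ {ops_old:.1e} operations")
print(f"n log n ~ {ops_new:.1e} operations")
print(f"ratio   ~ {ops_old / ops_new:.1e}x")  # roughly nine orders of magnitude
```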

Taken together, these improvements to the E3SMv2 meshing kernels represent a significant expansion to the regional-refinement capabilities available for the MPAS-O and -SI dynamical cores, enabling simulations incorporating complex patterns of regional, variable resolution to resolve fine-scale dynamics of interest, as illustrated for the Arctic and Antarctic in Figure 1.

The E3SM Atmosphere Model (EAM) adopts a highly scalable spectral element dynamical core that supports variable resolution through regional mesh refinement (Dennis et al. 2012, Guba et al. 2014, Taylor 2021). The Regionally Refined Mesh (RRM) capability in EAM preserves its key conservation and scalability features and has been demonstrated to improve simulations over refined regions to a level comparable to globally uniform high resolution, without negatively impacting performance elsewhere (Rasch et al. 2019, Tang et al. 2019, Zarzycki et al. 2015). As an example, the Antarctic mesh in Figure 3 enhances resolution over the Southern Ocean while also improving the representation of complex terrain on the Antarctic continent. It is designed to improve simulations of consequential synoptic-to-local-scale phenomena, including mesoscale structure along Southern Hemisphere storm tracks and katabatic winds over coastal Antarctica.

The meshes in Figure 3 have uniform 25-km resolution within the refined area and gradually transition to EAM’s standard resolution of 110 km elsewhere. The established workflow, documented on the E3SM Confluence space, is used to create the mesh (based on Taylor and Zarzycki 2014) and generate the supporting files (Hillman et al. 2021) for simulations with E3SM. The RRM was generated using the Spherical Quadrilateral Mesh Generator (SQuadGen; Ullrich 2015), following a procedure developed by Guba (2014), which is the preferred approach for mesh refinement over an unstructured area (i.e., one with a non-functional-form periphery). SQuadGen uses a user-specified grayscale PNG image to define the refinement area, with the level of refinement determined by the shading (either from white to black or the reverse); the transition zone can thus be controlled via gradient shading. The PNG image typically uses the standard-resolution (110 km, Table 1) global mesh in equidistant cylindrical projection as the background to aid the placement (or “drawing”) of the region of interest for refinement, which can be done using an image editor (e.g., Photoshop, GIMP). The PNG image saved from the overlaid drawing is then fed to SQuadGen to generate the RRM mesh in Exodus format, with the refinement level specified as a command-line option to SQuadGen. The generated mesh is visually inspected using grid-plotting utilities and refined through further editing of the PNG file as needed, iterating on this process until a satisfactory final mesh is obtained.
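
The grayscale refinement map need not be drawn by hand. The sketch below generates a simple Antarctic refinement PNG programmatically; the image dimensions, the white-to-black shading convention, and the latitude of the transition band are assumptions for illustration, and the SQuadGen documentation should be consulted for the exact convention it expects.

```python
import numpy as np
from PIL import Image

# Equidistant cylindrical (lat-lon) canvas; the size is an arbitrary choice.
nlon, nlat = 720, 360
lat = np.linspace(90.0, -90.0, nlat)           # image rows run north to south
lat2d = np.repeat(lat[:, None], nlon, axis=1)

# Assumed convention: white (255) = no refinement, black (0) = full refinement.
# Refine south of 55S, with a smooth 10-degree transition band (assumption).
weight = np.clip((-lat2d - 55.0) / 10.0, 0.0, 1.0)   # 0 -> background, 1 -> refined
pixels = np.uint8(np.round(255.0 * (1.0 - weight)))

Image.fromarray(pixels, mode="L").save("antarctic_refine.png")
```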

The new grid can be tested by running an idealized baroclinic instability problem (Jablonowski and Williamson 2007) in HOMME (the spectral element dynamical core) in standalone mode (following Taylor and Zarzycki 2014). The quality of the RRM can also be examined in terms of the degree of element distortion computed when running HOMME in E3SM or standalone. Ultimately, the RRM mesh needs to be tested in the full EAM or E3SM. To run the full model with the RRM mesh, supporting files are required to define the dynamical structure, initialize the atmosphere model, and enable the coupling of the atmospheric RRM with the other E3SM components. These include the initial condition files, the topography file, the mapping and domain files, and the corresponding control volume mesh file that is used in E3SMv2 with the physics grid (pg2). The full procedure is documented in the step-by-step guide for running E3SM on a new grid (Hillman et al. 2021). The dynamical core parameters for running the RRM are typically set to be the same as required by the finest grid in the RRM. The model physics time step in E3SMv2, however, is set to be the same as required by the base grid (i.e., the 110-km-resolution grid for this Antarctic RRM). This choice benefits the simulation throughput and minimizes the need for tuning, as the standard model has been well tuned and any retuning in the interest of the refined region could have global implications. With these parameter settings, EAM running on this Antarctic RRM is 2.5 times as expensive as the standard model, only slightly larger than the ratio of the number of spectral elements between the two meshes.