MITgcm Coupled 2.8 Degree Atmosphere-Ocean and Sea-Ice Configuration - JMC Code L First Code Improvement
These pages describe an intermediate-complexity coupled configuration of
MITgcm that has been adapted to fit within ESMF. The coupled configuration is being
used as part of the ESMF development process to evaluate ESMF scaling in a real code.
- The results presented here use the ESMF_1_0_4 internal release, which is the latest
release available. Release notes for ESMF internal releases can be found under the
development link at the ESMF project web site.
- A description of the MITgcm configuration used for this test is given below.
- The code for this configuration can be downloaded from here.
- A users guide for the coupled configuration can be found here and downloaded
from here.
- Output fields from the simulation are shown below.
- Time to solution summary can be found in the time to solution section below.
- Results of scaling tests can be found in the configuration scaling section below.
- Plans for future development are described in the future development plans section below.
The configuration described here is the ESMF
JMC Code L First Code Improvement configuration. This is a coupled atmosphere-ocean simulation
that uses MITgcm for both fluids. The isomorphic equation set that is used
in both fluids is described in the Continuous Formulation section
of the first chapter of the MITgcm release1
documentation.
Additionally an embedded thermodynamic ice model is included for
sea-ice covered regions. Land surface processes are driven by climatologies
of soil moisture, land temperature and albedo.
The fluid equations are stepped forward on a latitude longitude grid using
the time stepping procedure outlined in the Time Stepping
section of the second chapter of the MITgcm release1
documentation.
The sections Spatial discretization of the dynamical equations
and Tracer equations describe the form of the
horizontal (latitude-longitude) and vertical (pressure in the
atmosphere and height in the ocean) spatial gridding.
Computationally, the JMCl first-code-improvement configuration uses a single
executable with two major ESMF gridded components that are derived
from the latest stable MITgcm code base. The single-executable, ESMF gridded-component
form of the configuration is automatically generated from sources
extracted from the MITgcm source tree revision
checkpoint52, a stable
development revision that includes MITgcm code modifications up to November 7th, 2003.
Additional MITgcm packages are embedded in the
configuration to support sea-ice and surface boundary flux
calculations.
Data transport between ESMF gridded components is supported by
two unidirectional ESMF coupler components. These two couplers map import and
export states between the atmosphere and ocean with embedded sea-ice
gridded components. The coupler components take the place of a separate
coupler executable that is referenced in the JMCl baseline milestone.
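The single-executable arrangement above can be pictured as one driver loop advancing two gridded components, with two one-way couplers copying export states into the opposite component's import state. The sketch below is a conceptual Python illustration of that control flow only; the class and field names are assumptions, not the ESMF Fortran API or the actual MITgcm component code.

```python
# Conceptual sketch (NOT the real ESMF API) of the single-executable layout:
# two gridded components advanced by one driver, with two unidirectional
# couplers mapping export states to import states. All names are illustrative.

class GriddedComponent:
    def __init__(self, name):
        self.name = name
        self.export_state = {}   # fields this component publishes
        self.import_state = {}   # fields written here by a coupler

    def run(self, step):
        # Stand-in for one model step; a real component would integrate
        # its fluid using the boundary fluxes found in import_state.
        self.export_state[f"{self.name}_field"] = step

class Coupler:
    """One-way coupler: copies (and, in a real system, regrids) fields."""
    def __init__(self, src, dst):
        self.src, self.dst = src, dst

    def run(self):
        self.dst.import_state.update(self.src.export_state)

atm = GriddedComponent("atmosphere")   # includes the embedded sea-ice model
ocn = GriddedComponent("ocean")
atm_to_ocn = Coupler(atm, ocn)         # the two unidirectional couplers
ocn_to_atm = Coupler(ocn, atm)

for step in range(3):                  # driver loop inside one executable
    atm.run(step)
    ocn.run(step)
    atm_to_ocn.run()
    ocn_to_atm.run()
```

Because both couplers run inside the same executable and address space, no separate coupler process is needed, which is the structural change relative to the JMCl baseline milestone.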
The internal code structure of each component follows the
outline given under the Browse Code tab of the MITgcm release1
web page.
The overall approach to parallelism is described in the
Software Architecture chapter of the MITgcm release1
documentation.
In the JMCl first-code-improvement configuration
the WRAPPER functions
have been replaced by ESMF infrastructure fields, grids, and
communication-layer calls, using ESMF_FieldHalo() and
ESMF_FieldAllGather() respectively. The scaling curve
in the scaling section below shows the impact
of this substitution.
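The essential operation behind a halo call is filling each subdomain's ghost points from its neighbours' interiors. The following serial Python sketch illustrates that idea on a periodic 1-D decomposition; it is purely conceptual and stands in for what ESMF_FieldHalo() does across processors for distributed fields.

```python
# Illustrative sketch of a halo update on a 1-D periodic decomposition.
# Each subdomain is a list [halo_lo, interior..., halo_hi]; the halo points
# are filled from the neighbouring subdomains' interior edge values.

def halo_update(subdomains):
    """Fill a one-point halo on each side of every subdomain from its
    periodic neighbours."""
    n = len(subdomains)
    for i, sub in enumerate(subdomains):
        left = subdomains[(i - 1) % n]
        right = subdomains[(i + 1) % n]
        sub[0] = left[-2]    # last interior point of the left neighbour
        sub[-1] = right[1]   # first interior point of the right neighbour
    return subdomains

# Global field 0..7 split across two "processes", one halo point each side.
subs = [[None, 0, 1, 2, 3, None], [None, 4, 5, 6, 7, None]]
halo_update(subs)
# subs[0] -> [7, 0, 1, 2, 3, 4]; subs[1] -> [3, 4, 5, 6, 7, 0]
```

ESMF_FieldAllGather() plays the complementary role, assembling a full global field on every processor rather than exchanging only the edge points.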
The directory structure for the contents of the downloaded
source tree is described in a README file in that
source tree. The layout and build procedures closely follow the
arrangement described in the MITgcm documentation; however,
for this configuration the code organization contains
independent MITgcm code trees for each of the two separate
gridded components. The standard MITgcm genmake
tool is used to build the code base, but it is driven from a
higher-level build script, described in the
configuration users guide, that includes
automated script processing to map the non-ESMF code base
into an ESMF-ready form.
The downloaded configuration is set to simulate a
ten-day period following a fifty-year spin up. The figure below shows
the monthly average of the ocean currents for the upper ocean (25m and 170m deep) at the
start of the August immediately prior to the fifty-year pickup.
Velocity vectors for the top and third ocean levels (25m and 170m) after
49 years and 8 months of integration
Atmospheric winds for the same period at 950mb and 500mb are shown below.
Wind vectors from the lower and mid atmospheric levels (950mb and 500mb) after
49 years and 8 months of integration
The time to solution for the configuration using ESMF infrastructure
and superstructure is measured as 249 seconds on 16 processors of the 64-processor
Pentium 4 cluster at MIT. This is the time
for a ten-day simulation that starts from a spun-up fifty-year state.
The base time-step for this configuration is 450 seconds.
In a ten-day period all the elements of the coupled system are exercised
many times, so the timing is representative of extended simulations.
The time to solution has also been measured at other processor counts
in order to allow a scaling profile to be calculated.
The time to solution for a 16-processor run of the same configuration without ESMF
is measured as 120 seconds on the same 64-processor Pentium 4 cluster at MIT.
The non-ESMF time to solution has also been measured at other processor counts
so that a scaling profile can be calculated. Both the ESMF and non-ESMF scaling
are shown below.
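The quoted wall-clock times convert directly into the number of time-steps taken and into the simulated-years-per-day metric used in the scaling plot. The short calculation below uses only figures stated in the text (ten simulated days, a 450-second time-step, and the 249 s and 120 s 16-processor timings).

```python
# Convert the measured wall-clock times into time-steps taken and into the
# "simulated years per day" throughput metric used in the scaling plot.

SIM_DAYS = 10                        # length of the benchmark run
DT = 450                             # base time-step, seconds
steps = SIM_DAYS * 86400 // DT
print(steps)                         # 1920 time-steps per ten-day run

def years_per_day(wall_seconds, sim_days=SIM_DAYS):
    """Simulated years completed per 24 h of wall-clock time."""
    return (sim_days / 365.0) * (86400.0 / wall_seconds)

print(round(years_per_day(249), 1))  # ESMF run, 16 processors: ~9.5
print(round(years_per_day(120), 1))  # non-ESMF run, 16 processors: ~19.7
```

These two derived throughput values are the kind of per-configuration points plotted against processor count in the figure below.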
The plot below shows the number of simulated years per day that the MITgcm coupled system
currently achieves for different processor counts with and without ESMF. Three curves are
shown. The black circle curve shows the scaling measured on Halem and described in
the baseline scaling report. The blue circle curve
shows scaling on the MITgcm Pentium 4 cluster for the non-ESMF form of the
JMCl first-code-improvement configuration. This scaling is comparable to the
Halem measurements. The green triangle curve shows the scaling of the ESMF
form of the JMCl first-code-improvement configuration. For this development
release of ESMF the scaling is below both
the Halem baseline and the current code measurement. A final point, a red cross, shows the
years per day for the current MITgcm configuration using a cube-sphere grid and operating
outside ESMF. This calculation will form the basis of the milestone G ESMF performance
evaluation. The current ESMF implementation only supports Cartesian topologies and so
cannot be applied to cube-sphere grid configurations.
Scaling of MITgcm with processor count for ESMF based and non-ESMF based configurations.
Throughput in years of coupled simulation per day for the 2.8 degree
coupled configuration JMC Code L First Code Improvement.
During the course of the ESMF project the coupled code that forms the basis for JMC Code L will evolve
in all of the following areas:
- Further framework functionality will be introduced as the framework implementation hardens and broadens in extent.
- A fully dynamic ice component will be deployed.
- A simplified interactive land surface component will be deployed.
- A flexible ensemble driver will be overlaid on the configuration.
- Additional options for automatic differentiation will be introduced.
These developments will affect both the scaling and the time to solution for the
test problem in both its non-framework compliant and framework compliant forms.
Authors: Chris Hill, Erica Peterson - December 2003.
Department of Earth, Atmospheric and Planetary Sciences, MIT