Scalable Earth System Models (ScalES)

Aims & Objectives

The ScalES project was initiated by the German High Performance Computing Centre for Climate and Earth System Research (DKRZ) within the "IKT 2020 - Forschung für Innovation" program of the Federal Ministry of Education and Research (BMBF) to address some shortcomings of today's Earth system models.

Climate change may be the greatest challenge mankind has ever had to deal with: its outcome affects not only our own existence but that of every species on the planet. The NumHPC group decided to take on this challenge by helping to develop and improve the Earth system models (ESMs) used to predict climate change. The Intergovernmental Panel on Climate Change (IPCC) bases its Climate Change Reports on the results of such predictions, and these reports are an important input for governments around the world when deciding on actions against climate change.

The five essential goals of ScalES are

  • improving parallel I/O on HPC architectures,
  • developing dynamic load distribution for ESMs,
  • improving the OASIS4 coupler for different ESM components and grids,
  • improving performance of ESMs by better algorithms and adaptation to the system architecture,
  • achieving sustainability and utilization of the project's results.

Research Topics

Parallel I/O library and I/O servers

Data transport and storage in today's Earth system models are increasingly time consuming, because their essentially serial implementation cannot keep up with continually increasing parallelization and resolution. To fully exploit state-of-the-art parallel file systems such as GPFS or Lustre, the model I/O needs to be restructured and parallelized. This project aims to develop such an implementation and to make it available as a library that can serve multiple models.
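A minimal sketch of the kind of collective write such a library performs internally is shown below. It uses plain MPI-IO; the file name, field size, and block decomposition are illustrative assumptions, and the code does not show the actual ScalES I/O library interface.

    /* Sketch: each process writes its contiguous slice of a global field
     * to one shared file with collective MPI-IO.  File name, field size
     * and decomposition are illustrative only, not the ScalES I/O API. */
    #include <mpi.h>
    #include <stdlib.h>

    int main(int argc, char **argv)
    {
        MPI_Init(&argc, &argv);

        int rank, nprocs;
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &nprocs);

        const int nglobal = 1 << 20;           /* global number of values  */
        const int nlocal  = nglobal / nprocs;  /* assume it divides evenly */

        double *field = malloc(nlocal * sizeof(double));
        for (int i = 0; i < nlocal; ++i)
            field[i] = (double)(rank * nlocal + i);

        /* Collective open and write: the MPI library and the parallel file
         * system (GPFS, Lustre) can aggregate the per-rank requests. */
        MPI_File fh;
        MPI_File_open(MPI_COMM_WORLD, "field.dat",
                      MPI_MODE_CREATE | MPI_MODE_WRONLY, MPI_INFO_NULL, &fh);
        MPI_Offset offset = (MPI_Offset)rank * nlocal * sizeof(double);
        MPI_File_write_at_all(fh, offset, field, nlocal, MPI_DOUBLE,
                              MPI_STATUS_IGNORE);
        MPI_File_close(&fh);

        free(field);
        MPI_Finalize();
        return 0;
    }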

Parallelization and Performance modules

To mitigate both dynamic and static load imbalance, a collection of partitioning routines will be implemented beneath a common interface. In addition, helper routines for related tasks in parallelized programs, e.g. free-form boundary exchanges, will complement this core. The library (scales-ppm) can be inspected on the DKRZ project server; the documentation is at http://scales.dkrz.de/scales-ppm
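To illustrate the kind of work a partitioning routine does, the sketch below splits a 1-D index range as evenly as possible across a number of parts. It is a generic example, not the scales-ppm interface.

    /* Generic sketch of an even 1-D block partition (not the scales-ppm
     * API): part 'rank' of 'nparts' owns the half-open index range
     * [start, start + count). */
    #include <stdio.h>

    static void block_partition(int nglobal, int nparts, int rank,
                                int *start, int *count)
    {
        int base = nglobal / nparts;   /* minimum chunk size                  */
        int rem  = nglobal % nparts;   /* the first 'rem' parts get one extra */

        *count = base + (rank < rem ? 1 : 0);
        *start = rank * base + (rank < rem ? rank : rem);
    }

    int main(void)
    {
        /* Example: 10 grid columns distributed over 4 parts. */
        for (int rank = 0; rank < 4; ++rank) {
            int start, count;
            block_partition(10, 4, rank, &start, &count);
            printf("part %d: start=%d count=%d\n", rank, start, count);
        }
        return 0;
    }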

Communication library UniTrans

Many Earth system models use hand-written code to decompose a global simulation domain into smaller parts, which are then distributed among the available processes. Depending on the physical quantity being evaluated, each part often needs to access data from neighboring parts, which leads to inter-process communication, mostly expressed with the MPI communication library.

UniTrans is a library, currently layered on top of MPI, that offers a flexible and general concept for describing decompositions and the transitions between them. This includes, e.g., the global transpositions required in ECHAM or the local boundary exchanges used in MPIOM. The concept simplifies the development of model communication code and relieves the models of machine-dependent communication performance aspects. The UniTrans library was implemented by the project partner IBM. The pending release process is expected to result in an open-source library soon.
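The hand-written pattern UniTrans replaces typically looks like the following sketch: a 1-D decomposition in which each process exchanges one halo cell with its left and right neighbors via MPI_Sendrecv. This is generic MPI code for illustration, not the UniTrans interface.

    /* Generic sketch of a 1-D halo (boundary) exchange with MPI_Sendrecv;
     * this is the hand-written pattern UniTrans abstracts, not its API. */
    #include <mpi.h>

    #define NLOCAL 100

    int main(int argc, char **argv)
    {
        MPI_Init(&argc, &argv);

        int rank, nprocs;
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &nprocs);

        /* Local array with one halo cell on each side: u[0], u[NLOCAL+1]. */
        double u[NLOCAL + 2];
        for (int i = 1; i <= NLOCAL; ++i)
            u[i] = (double)rank;

        int left  = (rank == 0)          ? MPI_PROC_NULL : rank - 1;
        int right = (rank == nprocs - 1) ? MPI_PROC_NULL : rank + 1;

        /* Send the first interior cell to the left neighbor while receiving
         * the right neighbor's first cell into the right halo, and vice
         * versa; MPI_PROC_NULL turns the domain boundaries into no-ops. */
        MPI_Sendrecv(&u[1], 1, MPI_DOUBLE, left, 0,
                     &u[NLOCAL + 1], 1, MPI_DOUBLE, right, 0,
                     MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        MPI_Sendrecv(&u[NLOCAL], 1, MPI_DOUBLE, right, 1,
                     &u[0], 1, MPI_DOUBLE, left, 1,
                     MPI_COMM_WORLD, MPI_STATUS_IGNORE);

        MPI_Finalize();
        return 0;
    }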

Improvements of OASIS 4 coupler

One of the core features of a modern ESM is the coupler, which handles the communication between the individual climate components. It has to be easy to use, allow models to be exchanged easily, and perform well on different computer architectures. The coupler OASIS4, which is used in this project, is one of the standard tools in the climate research community.
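Schematically, each model component calls the coupler once per coupling interval to import and export its boundary fields. The sketch below only illustrates that call pattern; coupler_get and coupler_put are made-up stub functions, not the OASIS4 API, and the field names are invented for the example.

    /* Conceptual sketch of a component time loop talking to a coupler.
     * coupler_get()/coupler_put() are hypothetical stubs, NOT the OASIS4
     * interface; they only show the per-interval import/export pattern. */
    #include <stdio.h>
    #include <stddef.h>

    static void coupler_get(const char *field, double *data, size_t n, int step)
    {
        for (size_t i = 0; i < n; ++i) data[i] = 0.1 * step;      /* stub data */
        printf("step %d: received %s\n", step, field);
    }

    static void coupler_put(const char *field, const double *data, size_t n,
                            int step)
    {
        (void)data; (void)n;                                       /* stub send */
        printf("step %d: sent %s\n", step, field);
    }

    int main(void)
    {
        enum { N = 4, NSTEPS = 3 };
        double sst[N] = {285.0, 286.0, 287.0, 288.0};  /* "ocean" state      */
        double wind_stress[N];                         /* imported each step */

        for (int step = 0; step < NSTEPS; ++step) {
            coupler_get("wind_stress", wind_stress, N, step);       /* import */
            for (int i = 0; i < N; ++i)                             /* advance */
                sst[i] += 0.01 * wind_stress[i];
            coupler_put("sea_surface_temperature", sst, N, step);   /* export */
        }
        return 0;
    }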

Solver for ocean level with machine-targeted parameterization

The current solver is improved by implementing the conjugate gradient (CG) method and the additive Schwarz method. Hybrid parallelization is used to exploit different hardware architectures more efficiently. The solver is included in the scales-ppm library; a how-to is available at http://scales.dkrz.de/scales-ppm/solver_howto.html
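For reference, the core of an unpreconditioned CG iteration looks like the sketch below, here applied to a small 1-D Laplacian chosen purely for illustration. The actual scales-ppm solver adds the additive Schwarz preconditioner and the hybrid parallelization mentioned above (see the linked how-to).

    /* Minimal serial sketch of the unpreconditioned CG method for a 1-D
     * Laplacian; illustrative only.  The scales-ppm solver additionally
     * uses an additive Schwarz preconditioner and hybrid parallelization. */
    #include <stdio.h>
    #include <math.h>

    #define N 100

    /* y = A*x with A the 1-D Laplacian (tridiagonal 2, -1). */
    static void apply_A(const double *x, double *y)
    {
        for (int i = 0; i < N; ++i) {
            double left  = (i > 0)     ? x[i - 1] : 0.0;
            double right = (i < N - 1) ? x[i + 1] : 0.0;
            y[i] = 2.0 * x[i] - left - right;
        }
    }

    static double dot(const double *a, const double *b)
    {
        double s = 0.0;
        for (int i = 0; i < N; ++i) s += a[i] * b[i];
        return s;
    }

    int main(void)
    {
        double b[N], x[N] = {0.0}, r[N], p[N], Ap[N];
        for (int i = 0; i < N; ++i) b[i] = 1.0;          /* right-hand side */

        /* With x = 0 the initial residual r = b; search direction p = r. */
        for (int i = 0; i < N; ++i) { r[i] = b[i]; p[i] = r[i]; }
        double rr = dot(r, r);

        for (int iter = 0; iter < 1000 && sqrt(rr) > 1e-10; ++iter) {
            apply_A(p, Ap);
            double alpha = rr / dot(p, Ap);              /* step length      */
            for (int i = 0; i < N; ++i) {
                x[i] += alpha * p[i];
                r[i] -= alpha * Ap[i];
            }
            double rr_new = dot(r, r);
            double beta = rr_new / rr;                   /* direction update */
            for (int i = 0; i < N; ++i) p[i] = r[i] + beta * p[i];
            rr = rr_new;
        }

        printf("final residual norm: %e\n", sqrt(rr));
        return 0;
    }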

Funding

This project was funded by the BMBF. Duration: 01.01.2009-31.12.2011.

Partners

These five goals were to be accomplished within three years, until September 2011, under the coordination of DKRZ and its partners

  • Alfred Wegener Institute for Polar and Marine Research (AWI)
  • Max Planck Institute for Meteorology (MPI-M)
  • Max Planck Institute for Chemistry (MPI-C)
  • (Technical) University of Karlsruhe (UniKA)
  • IBM

and its associated partner NEC Laboratories Europe, IT Research Division (NLE-IT).
