You are viewing a Web site, archived on 11:40:03 Oct 15, 2004. It is now a Federal record managed by the National Archives and Records Administration.

Parallel Programming Tools

Forecast Systems Laboratory
Aviation Division
Advanced Computing Branch

There are several obstacles to the operational use of current distributed memory HPC systems: portability, programmability, and performance.  Historically, NWP models have been large codes that did not run efficiently on HPC systems without some restructuring.  Typically, a message passing library such as PVM or MPI is used to handle parallel programming issues including inter-process communication, process synchronization, global operations, data decomposition, and I/O handling.  The development of an efficient message passing standard (currently MPI) supported by most vendors has improved the portability of models on distributed memory HPCs.  However, MPI is sufficiently low level that it can be difficult to use.  To speed code parallelization, the ACB has developed a high-level tool called the Scalable Modeling System (SMS) that simplifies the work required to port and run NWP models on HPCs while offering good scalable performance.
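To illustrate the kind of low-level bookkeeping that raw message-passing code imposes, here is a minimal sketch (plain Python standing in for hand-written Fortran/MPI) of the index arithmetic each process must perform when a one-dimensional grid is block-decomposed across processors; the function name and layout are illustrative, not taken from SMS or MPI:

```python
# Sketch: the index bookkeeping a programmer does by hand with raw MPI.
# A global grid of npts points is block-decomposed across nprocs ranks;
# each rank must work out which contiguous slice of the grid it owns.

def local_range(npts, nprocs, rank):
    """Return (start, end) global indices owned by `rank` (0-based, end exclusive)."""
    base, extra = divmod(npts, nprocs)
    start = rank * base + min(rank, extra)
    end = start + base + (1 if rank < extra else 0)
    return start, end

# Example: 10 grid points over 3 ranks -> slices of size 4, 3, 3
print([local_range(10, 3, r) for r in range(3)])
# -> [(0, 4), (4, 7), (7, 10)]
```

A directive-based tool such as SMS generates this sort of decomposition logic automatically, which is one reason it is easier to use than hand-coded MPI.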

The Scalable Modeling System

SMS is a directive-based parallelization tool that translates Fortran code into a parallel version that runs efficiently on both shared and distributed memory systems, including the IBM SP2, Cray T3E, SGI Origin, Sun clusters, Alpha Linux clusters, and Intel clusters.  This software has been used successfully since 1993 to parallelize and run many oceanic and atmospheric models, some of which produce weather forecasts for the National Weather Service.
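The key idea behind a directive-based tool is that the directives live in comments, so the annotated source still compiles and runs serially, while a preprocessor reads the directives to generate the parallel version. The toy sketch below shows that mechanism with a hypothetical `c$par` directive (not SMS's actual syntax) embedded in a Fortran-style fragment:

```python
# Toy sketch of the directive idea (hypothetical "c$par" syntax, NOT the
# real SMS directives): directives are ordinary comments to the Fortran
# compiler, so the serial code is untouched; a translator scans for them
# to decide how to generate parallel code.

SOURCE = """\
c$par distribute(i)
      do i = 1, n
         a(i) = b(i) + c(i)
      end do
"""

def find_directives(src):
    """Return (line_number, directive) pairs for comment lines starting with c$par."""
    hits = []
    for num, line in enumerate(src.splitlines(), start=1):
        stripped = line.strip()
        if stripped.startswith("c$par"):
            hits.append((num, stripped[len("c$par"):].strip()))
    return hits

print(find_directives(SOURCE))
# -> [(1, 'distribute(i)')]
```

Because the directives are comments, a single source file serves both the original serial model and the SMS-generated parallel one.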

Models Parallelized using SMS

Atmospheric Models
  • 32 km ETA Model
    • National Centers for Environmental Prediction (NCEP)
  • Rapid Update Cycle Model (RUC)
    • NOAA's Forecast Systems Laboratory (NOAA/FSL)
  • Quasi-Non Hydrostatic Model (QNH)
    • NOAA's Forecast Systems Laboratory (NOAA/FSL)
  • Global Forecast System (GFS)
    • Central Weather Bureau (CWB)    - Taiwan
  • Typhoon Forecast System (TFS)
    • Central Weather Bureau (CWB)    - Taiwan
  • Atmospheric Chemistry Model (NALROM)
    • NOAA's Aeronomy Laboratory (NOAA / AL)
Oceanic Models
  • Princeton Ocean Model (POM)
    • NASA Goddard
    • NOAA / Environmental Technology Laboratory (NOAA/ETL)
  • Regional Ocean Modeling System (ROMS)
    • Rutgers University
    • National Institute of Water and Atmospheric Research (NIWA)    - New Zealand
    • University of Alaska at Fairbanks (UAF)
    • NOAA / Pacific Marine Environmental Laboratory (NOAA/PMEL)
  • Hybrid Coordinate Ocean Model (HYCOM)
    • Los Alamos National Laboratory

These models contain structured regular grids that are solved using either finite difference approximations or Gauss-Legendre spectral methods.  SMS also provides support for mesh refinement, and can transform data between grids that have been decomposed differently (e.g., grid and spectral space).  While the tool has been tailored toward finite difference and spectral weather and climate models, the approach is sufficiently general to be applied to other structured grid codes.  As SMS has matured, the time and effort required to parallelize codes for massively parallel processors (MPPs) has been reduced significantly.  Code parallelization has become simpler because SMS provides support for advanced operations including incremental parallelization and parallel debugging.
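Transforming data between differently decomposed grids amounts to a redistribution (transpose) of the field across processors: for example, rows of a 2-D field may be split across ranks in grid space while columns are split in spectral space. A minimal sketch of that redistribution, with plain Python lists standing in for distributed Fortran arrays and the function name purely illustrative:

```python
# Sketch of redistributing a 2-D field between two decompositions, as is
# needed when moving between grid space (rows split across ranks) and
# spectral space (columns split across ranks). In a real run each chunk
# lives on a different processor and the reshuffle is a communication step.

def transpose_decomp(row_chunks):
    """row_chunks[r] holds the whole rows owned by rank r; return per-rank column chunks."""
    rows = [row for chunk in row_chunks for row in chunk]    # gather (conceptually)
    ncols = len(rows[0])
    cols = [[row[c] for row in rows] for c in range(ncols)]  # transpose
    return [[col] for col in cols]                           # one column per rank, for simplicity

row_decomp = [[[1, 2]],   # rank 0 owns row 0
              [[3, 4]]]   # rank 1 owns row 1
print(transpose_decomp(row_decomp))
# -> [[[1, 3]], [[2, 4]]]
```

In practice the gather is never global; each pair of ranks exchanges only the sub-blocks the other needs, but the before/after ownership shown here is the essence of the operation.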

High Performance: SMS provides a number of performance optimizations.  The SMS run-time libraries have been optimized to speed inter-processor communication using techniques such as aggregation.  Array aggregation permits multiple model variables to be combined into a single communications call to reduce message-passing latency.  SMS also allows the user to perform computations in the halo region to reduce communication.  High performance I/O is also provided by SMS: since atmospheric models typically output forecasts several times during a model run, SMS can output these data asynchronously with model execution.  These optimizations can lead to significantly faster execution times.
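The aggregation idea can be sketched simply: rather than paying message-passing latency once per variable, the halo values of several variables are packed into one buffer and sent in a single call. The pack/unpack functions below are illustrative names, not part of SMS:

```python
# Sketch of array aggregation: instead of one message per model variable,
# halo values from several variables are packed into a single buffer so
# only one send/receive (and one latency cost) is paid per exchange.

def pack_halos(variables, halo_width):
    """Pack the rightmost halo_width points of each 1-D variable into one buffer."""
    buf = []
    for var in variables:
        buf.extend(var[-halo_width:])
    return buf

def unpack_halos(buf, nvars, halo_width):
    """Split the aggregated buffer back into per-variable halo slices."""
    return [buf[i * halo_width:(i + 1) * halo_width] for i in range(nvars)]

u = [1.0, 2.0, 3.0]    # e.g. wind component
t = [10.0, 20.0, 30.0] # e.g. temperature
buf = pack_halos([u, t], halo_width=2)   # one message instead of two
print(buf)                               # -> [2.0, 3.0, 20.0, 30.0]
print(unpack_halos(buf, nvars=2, halo_width=2))
# -> [[2.0, 3.0], [20.0, 30.0]]
```

With many variables and small halos, the per-message latency saved this way can dominate the cost of the exchange, which is why aggregation pays off on distributed memory machines.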

SMS Documentation

The following documents are available in MS Office 2000 Word or PDF format.  Viewing or printing files downloaded in PDF format requires the freely available Adobe Acrobat Reader.

SMS Software Downloads

SMS is supported on the following systems: IBM SP2, SGI Origin, Cray T3E, and clusters of Alpha, Intel, or Sun processors.

SMS software is freely available, but there are some restrictions.  To install, gunzip the downloaded file, un-tar it, cd into the unpacked SMS directory, and then follow the instructions in the file INSTALL.

Software patches are also available for user-reported bugs. These patch releases have not been tested on all platforms on which SMS is supported. When these changes are integrated into a full release, they will be tested on all machines.

All questions regarding SMS may be directed to sms-info.fsl@noaa.gov.

Additional Information

Once you begin using SMS to parallelize code, the following information may be useful. Any additional questions should be directed to sms-info.fsl@noaa.gov.

Other Activities and Interests

Prepared by Mark Govett, Mark.W.Govett@noaa.gov
Date of last update:    September-2003