
IBM SP — Seaborg

About Seaborg


The NERSC IBM RS/6000 SP, named Seaborg, is a distributed-memory computer with 6,080 processors available to run scientific computing applications. Each processor has a peak performance of 1.5 GFlops.

The processors are distributed among 380 compute nodes with 16 processors per node. The processors on each node share a memory pool of between 16 and 64 GB (4 nodes have 64 GB; 64 have 32 GB; 312 have 16 GB). An SP node is thus an example of a shared-memory multiprocessor (SMP).
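
Applications typically exploit this layout by passing messages between nodes while threading across the shared memory within a node. Below is a minimal hybrid MPI+OpenMP sketch in C; it is an illustration of the programming model, not a NERSC-supplied code, and assumes a thread-safe MPI library and an OpenMP-capable compiler.

    /* Hybrid MPI + OpenMP sketch: one MPI task per node, one OpenMP
       thread per processor (an illustration, not a NERSC code). */
    #include <stdio.h>
    #include <mpi.h>
    #include <omp.h>

    int main(int argc, char **argv)
    {
        int rank, ntasks;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &ntasks);

        /* Threads share the node's memory pool; MPI moves data
           between nodes over the switch. */
        #pragma omp parallel
        printf("task %d of %d, thread %d of %d\n",
               rank, ntasks, omp_get_thread_num(), omp_get_num_threads());

        MPI_Finalize();
        return 0;
    }

On a 16-processor node, such a code might run as one MPI task with 16 OpenMP threads, or some intermediate split such as 4 tasks of 4 threads each.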

The compute nodes are connected to each other with a high-bandwidth, low-latency switching network. Each node runs its own full instance of the standard AIX operating system. The disk storage system is a distributed, parallel I/O system called GPFS. Additional nodes serve exclusively as GPFS servers.
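
Because GPFS presents a single shared file system to every node, all tasks of a parallel job can read and write one common file. The fragment below sketches that pattern with MPI-IO, each task writing its own block at a rank-derived offset; the file name and block size are illustrative assumptions, not Seaborg defaults.

    /* MPI-IO sketch: each task writes its block of one shared file.
       "output.dat" and the block size are illustrative choices. */
    #include <mpi.h>

    #define N 1024                    /* doubles per task */

    int main(int argc, char **argv)
    {
        int rank, i;
        double buf[N];
        MPI_File fh;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        for (i = 0; i < N; i++)       /* fill with recognizable data */
            buf[i] = rank;

        MPI_File_open(MPI_COMM_WORLD, "output.dat",
                      MPI_MODE_CREATE | MPI_MODE_WRONLY,
                      MPI_INFO_NULL, &fh);
        MPI_File_write_at(fh, (MPI_Offset)rank * N * sizeof(double),
                          buf, N, MPI_DOUBLE, MPI_STATUS_IGNORE);
        MPI_File_close(&fh);

        MPI_Finalize();
        return 0;
    }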

Seaborg's network switch is the IBM "Colony" switch, attached to each node through two "GX Bus Colony" network adapters. For more technical information about the switch, see the IBM Redbook Understanding and Using the IBM SP Switch.
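
One way to observe the switch's latency and bandwidth from user code is an MPI ping-pong between two tasks placed on different nodes. The sketch below is a generic micro-benchmark, not a NERSC utility; the message size and repetition count are arbitrary choices.

    /* Generic MPI ping-pong micro-benchmark sketch: run with two
       tasks on different nodes to exercise the switch. */
    #include <stdio.h>
    #include <mpi.h>

    #define NBYTES (1 << 20)   /* 1 MB message (example size) */
    #define REPS   100

    static char buf[NBYTES];

    int main(int argc, char **argv)
    {
        int rank, i;
        double t0, t1, rt;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        t0 = MPI_Wtime();
        for (i = 0; i < REPS; i++) {
            if (rank == 0) {
                MPI_Send(buf, NBYTES, MPI_CHAR, 1, 0, MPI_COMM_WORLD);
                MPI_Recv(buf, NBYTES, MPI_CHAR, 1, 0, MPI_COMM_WORLD,
                         MPI_STATUS_IGNORE);
            } else if (rank == 1) {
                MPI_Recv(buf, NBYTES, MPI_CHAR, 0, 0, MPI_COMM_WORLD,
                         MPI_STATUS_IGNORE);
                MPI_Send(buf, NBYTES, MPI_CHAR, 0, 0, MPI_COMM_WORLD);
            }
        }
        t1 = MPI_Wtime();

        if (rank == 0) {
            rt = (t1 - t0) / REPS;    /* seconds per round trip */
            printf("round trip %g s, bandwidth %g MB/s\n",
                   rt, 2.0 * NBYTES / rt / 1e6);
        }

        MPI_Finalize();
        return 0;
    }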

Seaborg specifications

NERSC's Seaborg System
    Total nodes:  416
      Compute:    380
      GPFS:        20
      Login:        6
      Network:      2
      Spare:        8
    Total disk:   44 TB

NERSC Seaborg Nodes
    IBM designation:  Nighthawk
    CPUs per node:    16
    Physical memory:  16-64 GB (4 nodes have 64 GB; 64 have 32 GB; 312 have 16 GB)
    Network adapters for inter-node communication:  2 ("csss" denotes the union of both cards)

POWER3 Processor
    Clock speed:              375 MHz
    FP results per clock:     4
    Peak performance:         1.5 GFlops
    L1 instruction cache:     32 KB
    L1 data cache:            64 KB
    L1 data cache line size:  128 bytes
    L2 cache:                 8192 KB (8 MB)
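
The peak figure follows directly from the entries above: 375 MHz × 4 floating-point results per clock = 1,500 MFlops = 1.5 GFlops per processor, or 16 × 1.5 = 24 GFlops per node.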

More technical information about SP POWER3 processors is available in the IBM Redbook POWER3 Introduction and Tuning Guide.

Seaborg Configuration

  • 416 16-processor nodes (with 64 GB, 32 GB, or 16 GB of memory)
  • 375 MHz POWER3+ processors
  • 1.5 GFlops peak performance per processor
  • 380 compute nodes (6,080 processors)
  • 6 login nodes
  • 20 nodes supporting the General Parallel File System (GPFS)
  • 2 network nodes
  • 8 spare nodes
  • 44 TB of disk space in GPFS
One Seaborg Node

[Figure: schematic of a single Seaborg node]

