Visualization: A Way to See the Unseen
 

Art and Science: An Alternative to Numbers

DeFanti and his colleague Maxine Brown summarized reasons for the booming popularity of visualization in Advances in Computers (1991):

"Much of modern science can no longer be communicated in print; DNA sequences, molecular models, medical imaging scans, brain maps, simulated flights through a terrain, simulations of fluid flow, and so on all need to be expressed and taught visually…. Scientists need an alternative to numbers. A technical reality today and a cognitive imperative tomorrow is the use of images. The ability of scientists to visualize complex computations and simulations is absolutely essential to ensure the integrity of analyses, to provoke insights, and to communicate those insights with others."

Over the years, two basic types of drawing systems have vied for the attention of both developers and users—vector graphics and raster graphics. Vector graphics systems are based on specifying the location of points on an X and Y coordinate system and connecting the points with lines. The basic drawing element of vector graphics is the line, created by an electron beam in the monitor as it moves directly from one set of coordinates to another, lighting up all the points in between. By contrast, the electron beam in the monitor of a raster graphics system scans across the screen, turning on specific picture elements (which came to be called pixels) in a pre-defined grid format.
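To make the distinction concrete, here is a minimal illustrative sketch in Python (not drawn from the NSF publication): a vector system needs only the endpoints of a line, which the hardware connects directly, while a raster system must decide which cells of its fixed pixel grid to light. The rasterize_line function below is a hypothetical helper using simple DDA-style stepping, one standard way of making that decision.

def rasterize_line(x0, y0, x1, y1):
    """Return the grid cells (pixels) lit when approximating a line from (x0, y0) to (x1, y1)."""
    steps = max(abs(x1 - x0), abs(y1 - y0))
    if steps == 0:
        return [(x0, y0)]
    dx = (x1 - x0) / steps
    dy = (y1 - y0) / steps
    return [(round(x0 + i * dx), round(y0 + i * dy)) for i in range(steps + 1)]

# Vector representation: only the two endpoints are stored; the electron beam
# sweeps directly from one to the other.
vector_line = ((0, 0), (7, 3))

# Raster representation: the same line approximated on a pixel grid.
print(rasterize_line(0, 0, 7, 3))
# [(0, 0), (1, 0), (2, 1), (3, 1), (4, 2), (5, 2), (6, 3), (7, 3)]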

While the precision of vector graphics was well suited to mechanical drawing, computer-aided design and manufacturing, and architectural computer graphics, raster graphics opened up possibilities in other areas and brought many more types of people into the world of computer graphics. It was perhaps the use of raster graphics in television advertising, including titles for network specials, that brought the public's attention to the potential of computer graphics. The low resolution of the television screen and the short viewing time—measured in seconds—called for relatively few calculations and was therefore less expensive in terms of power, speed, and memory.

Today, scientific visualization embodies the results that NSF hoped to achieve in funding the supercomputing centers: to find answers to important scientific questions while advancing both the science of computing and the art of using computer resources economically.

In 1992, the four supercomputer centers then supported by NSF (National Center for Supercomputing Applications in Urbana-Champaign, Illinois; Pittsburgh Supercomputing Center; Cornell Theory Center; and San Diego Supercomputer Center) formed a collaboration based on the concept of a national MetaCenter for computational science and engineering. The MetaCenter was envisioned as a growing collection of intellectual and physical resources unlimited by geographical or institutional constraints.

In 1994, the scientific computing division of the National Center for Atmospheric Research in Boulder, Colorado, joined the MetaCenter. The five partners, working with companies of all sizes, sought to speed commercialization of technology developed at the supercomputer centers, including visualization routines. An early success was Sculpt, a molecular modeling system developed at the San Diego Supercomputer Center. It earned a place on the cover of Science and has now been commercialized by a start-up company.

The concept of a national, high-end computation infrastructure for the U.S. science and engineering community has been greatly expanded since 1997, when the National Science Board, NSF's governing body, announced the Partnerships for Advanced Computational Infrastructure (PACI) as successor to the NSF Supercomputer Program. PACI supports two partnerships: the National Computational Science Alliance ("the Alliance") and the National Partnership for Advanced Computational Infrastructure (NPACI). Each partnership consists of a leading-edge site—for the Alliance it is the National Center for Supercomputing Applications in Urbana-Champaign, while the San Diego Supercomputer Center is the leading-edge site for NPACI—and a large number of other partners. NSF recently announced an award to the Pittsburgh Supercomputing Center to build a system that will operate at speeds well beyond a trillion calculations per second. The Terascale Computing System is expected to begin operation in early 2001, when Pittsburgh will become PACI's latest leading-edge site.

 
     