Visualization: A Way to See the Unseen
 

Visualizing a Virtual Reality

Other researchers are creating virtual realities: computer-driven worlds where everything is interconnected, allowing exploration on a level so extraordinary it approaches science fiction.

In previous studies of the Chesapeake Bay, scientists had to measure the wind, current, salinity, temperature, and fish populations separately. But with a virtual reality model, all the elements come together. Glen Wheless, a physical oceanographer at Old Dominion University, worked with William Sherman, a computer scientist at the National Center for Supercomputing Applications, to create a dynamic model of the Atlantic Ocean's saline waters converging with fresh water from more than 150 creeks and rivers that flow into the bay.

The model has given scientists new insights into how fish larvae are transported around the estuary; they are learning, for example, that they had previously underestimated the influence of wind, tides, and runoff.

The Chesapeake Bay virtual reality model is different from a computer animation in that it is interactive. Researchers can continually update the data and re-run the model. Computer animations, for all their explanatory power, cannot accommodate this demand; once completed, they are not easily changed.

Virtual environments are presented to the viewer through wide-field displays. Sensors track the viewer's movements through the data and update the sights and sounds accordingly. The result is a powerful mechanism for gaining insight into large, multidimensional phenomena. The Chesapeake Bay simulation was designed in one of the country's leading virtual environments for science, CAVE, which was pioneered with NSF support by the Electronic Visualization Lab at the University of Illinois at Chicago. CAVE is an acronym for Cave Automatic Virtual Environment, as well as a reference to "The Simile of the Cave" in Plato's Republic, which explores the ideas of perception, reality, and illusion through reference to a person facing the back of a cave where shadows are the only basis for understanding real objects.

CAVE is a darkened cubicle measuring ten by ten by nine feet. Sound and three-dimensional images derived from background data are projected onto three walls and the floor. Wearing special glasses, visitors get a sensation of stepping inside the simulation.

CAVE's technology has been used for many simulations, perhaps the most famous of which is Cosmic Voyage, an IMAX film that made its debut in 1996 at the Smithsonian National Air and Space Museum in Washington, D.C. The museum cosponsored the film project with NSF and Motorola. Cosmic Voyage includes a four-minute segment of research-quality scientific visualization. The segment tells a story that begins shortly after the Big Bang, continues through the expansion of the universe and the formation of galaxies, and ends with the collision of two spiral galaxies. The segment is the result of the collaborative efforts of NCSA scientific visualization experts, NSF-supported astronomers, two movie production companies, and numerous high-performance computing machines at multiple centers.

Donna Cox, professor of art and design at the University of Illinois, Urbana-Champaign, choreographed the various parts of the simulation segment. For the camera moves, she worked with staff of the Electronic Visualization Laboratory to create a voice-driven CAVE application called the Virtual Director, a virtual reality method for directing the computer graphics camera for real-time playback or animation recording. Approximately one-half of the sequence—the collision and the merging of two spiral galaxies—is based on a simulation carried out by Chris Mihos and Lars Hernquist of the University of California, Santa Cruz, on the San Diego Supercomputer Center's CRAY C90 system. As the galaxies merge and then draw apart, tidal forces and galactic rotation cause the galaxies to cast off stars and gas in the form of long, thin "tidal tails." The compression of interstellar gas into the merged galaxies fuels an intense burst of star formation. Mihos and Hernquist found that increasing the resolution of their simulation led to new science, "particularly," says Mihos, "the large number of small, condensing gas clouds in the colliding galaxies that could be related to the formation of young, luminous star clusters or small dwarf galaxies, which are seen in many observed galaxy collisions."

In February 2000, Passport to the Universe debuted at New York's Hayden Planetarium to critical praise. The digital film, made using Virtual Director software and other high-end computing and visualization resources from both the Alliance and NPACI, combines images of actual astronomical objects with simulations made by cosmology researchers to provide audiences with an unparalleled depiction of intergalactic travel.

 
     