Worth at Least a Thousand Data Points
The increased ability of scientists and engineers to model complex events is a direct result of NSF's investment in supercomputing centers. These university-based research facilities, started in the 1980s, gave researchers around the country access to the computational power they needed to tackle important, and difficult, problems. Visualization, while not an explicit goal of the supercomputing centers, quickly emerged as a way to cope with the massive amounts of scientific data that had been pouring out of computers since the 1960s. "We became very good at flipping through stacks of computer printouts," recalls Richard Hirsh, a specialist in fluid dynamics who is now NSF's deputy division director for Advanced Computational Infrastructure and Research. "But we realized that, at some point, people needed to see their solutions in order to make sense of them."
Humans are adept at recognizing patterns, Hirsh says, especially patterns involving motion. One of the early visualization success stories was a model of smog spreading over southern California, a model so informative and realistic that it helped to influence anti-pollution legislation in the state. As the cost of computer memory dropped and computer scientists began finding more applications for visualization techniques, the scientific community began to take notice.
The NSF Panel on Graphics, Image Processing, and Workstations published its landmark report Visualization in Scientific Computing in 1987. "ViSC [visualization in scientific computing] is emerging as a major computer-based field," the panel wrote. "As a tool for applying computers to science, it offers a way to see the unseen ... [it] promises radical improvements in the human/computer interface."
The NSF report was accompanied by two hours of videotape demonstrating the potential of the new tool.
"Before the publication of the report, the opinions and observations of many well-known and respected computer graphics experts were of little concern to the scientific and computing establishments," recalls Tom DeFanti, director of the Electronic Visualization Laboratory at the University of Illinois at Chicago and co-editor of the ViSC report. Today, he says, "their comments are sought after: to educate the public, to influence industry research, and to identify new scientific markets."
NSF earmarked funds for visualization at the supercomputing centers from 1990 to 1994. During that time, the application of visualization techniques spread. Since 1997, NSF's Partnerships for Advanced Computational Infrastructure (PACI) program has stimulated further advances in areas ranging from sophisticated tools for managing, analyzing, and interacting with very large data sets to collaborative visualization tools that enable far-flung researchers to work together in real time. Applications now span the whole of contemporary science. For example:
Molecular biologists use modeling to depict molecular interaction.
Astronomers visualize objects that are so far away they cannot be seen clearly with most instruments.
Medical researchers use computer visualization in many diagnostic techniques, including magnetic resonance imaging (MRI), which produces three-dimensional images of the body.