These eight posters were developed for presentation on May 9, 2000 at a luncheon convened by Congressman James Sensenbrenner, Chair of the House Science Committee, and attended by approximately 90 people, mostly Congressional staffers. Representatives from MIT, Old Dominion University (yours truly), Princeton, Rutgers, the University of California, and the University of Washington were on hand to respond to questions.
The purpose of the luncheon was to feature the basic science research performed by the U.S. Department of Energy in collaboration with universities. DOE has proposed a major basic research initiative entitled Scientific Discovery through Advanced Computing (SciDAC), to complement the very successful Accelerated Strategic Computing Initiative (ASCI), which currently drives research at the DOE defense programs laboratories and is jointly undertaken with 19 U.S. universities, including the Computer Science and Mathematics & Statistics Departments at ODU and the Institute for Computer Applications in Science and Engineering at NASA Langley.
The posters are by colleagues of mine in the Center for Advanced Scientific Computing (CASC) at the Lawrence Livermore National Laboratory. The eighth one describes two projects undertaken by my ODU CS students Satish Balay and Dinesh Kaushik, now at the Mathematics & Computer Science Division at Argonne National Laboratory.
DOE Research Enables World-Class Computational Modeling and Simulation
My originally proposed title:
"Computational Simulation that leads the Real World".
(Oh well, puns don't go over well in this business.)
Accessible Software Tools Simplify Complex Simulations
Basic computer science and applied mathematics research at the DOE national laboratories has led to the development of code frameworks that make complex simulation technology easily accessible to DOE and university scientists. Using modern object-oriented code development approaches, a research team at Lawrence Livermore and Los Alamos National Laboratories spent over five years developing the Overture framework illustrated here, which provides tools for the rapid implementation of high-resolution simulations involving complex, moving geometry. Researchers at University of California campuses, New York University, Dartmouth College, Rensselaer Polytechnic Institute, the University of Colorado, and Colorado State University employ this software for the simulation of complex physical phenomena. These computational tools have enabled the investigation of such diverse applications as the chemistry of biological cells, the interaction of the solar wind with the earth's magnetosphere, the design and understanding of chemical vapor-deposition reactors, and combustion in diesel engines.
While Moore's law has brought incredible gains in computational speed to the desktops of U.S. scientists, equally impressive gains in computational efficiency have been delivered through the mathematical development of modern solution algorithms. Scalable linear solver algorithms, such as multigrid, are essential to using effectively the additional computational resources provided by large-scale parallel computers when solving increasingly large problems. In many large-scale simulation codes, the majority of the run time is spent in a linear solver. Mathematicians and computer scientists from DOE laboratories and the University of Colorado have worked together to develop multigrid algorithms and software that have demonstrated speed-ups of 10 to more than 100 times for simulations involving groundwater flow and complex physics.
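The multigrid idea behind these speed-ups can be illustrated in a few lines. The toy below is my own minimal sketch, not the laboratories' production software: it applies recursive V-cycles to a one-dimensional Poisson problem, smoothing the error on the fine grid and correcting it cheaply on coarser grids.

```python
import numpy as np

def jacobi(u, f, h, iters=3, w=2/3):
    """Weighted-Jacobi smoother for -u'' = f on a uniform 1D grid."""
    for _ in range(iters):
        u[1:-1] = (1 - w) * u[1:-1] + w * 0.5 * (u[:-2] + u[2:] + h * h * f[1:-1])
    return u

def v_cycle(u, f, h):
    """One multigrid V-cycle, coarsening recursively by a factor of 2."""
    u = jacobi(u, f, h)                           # pre-smooth
    if len(u) <= 3:                               # coarsest grid: just smooth hard
        return jacobi(u, f, h, iters=20)
    r = np.zeros_like(u)                          # residual r = f - A u
    r[1:-1] = f[1:-1] - (2 * u[1:-1] - u[:-2] - u[2:]) / (h * h)
    rc = r[::2].copy()                            # restrict residual (injection)
    ec = v_cycle(np.zeros_like(rc), rc, 2 * h)    # coarse-grid error correction
    e = np.zeros_like(u)
    e[::2] = ec                                   # prolong: copy coarse points...
    e[1:-1:2] = 0.5 * (ec[:-1] + ec[1:])          # ...and interpolate midpoints
    return jacobi(u + e, f, h)                    # post-smooth

n = 129                                           # 2^7 + 1 points: coarsens evenly
h = 1.0 / (n - 1)
x = np.linspace(0, 1, n)
f = np.pi ** 2 * np.sin(np.pi * x)                # exact solution is sin(pi x)
u = np.zeros(n)
for _ in range(10):
    u = v_cycle(u, f, h)
print(np.max(np.abs(u - np.sin(np.pi * x))))      # error near discretization level
```

The key property, shared by the production algorithms, is that the work per cycle grows only linearly with the number of unknowns, which is what makes the method scalable.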
Understanding the Behavior of Complex Molecules
Recent advances in computational resources and corresponding algorithmic improvements have enabled Lawrence Livermore scientists to study, for the first time, simple molecules of biochemical importance. The method employed, known as ab initio molecular dynamics, integrates Newton's equations of motion using forces derived from quantum mechanical considerations. Using high-performance computers available at the DOE laboratories, this method can now routinely be applied to the simulation of systems of several hundred atoms. It is currently used to investigate the properties of liquids and solids under extreme conditions, semiconductor surfaces, and atomic clusters, as well as biological molecules.
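For readers curious about the "integrates Newton's equations of motion" part: the standard time-stepper in molecular dynamics is the velocity-Verlet scheme. The sketch below is a purely classical toy of my own (a single particle in a harmonic well); in ab initio molecular dynamics the force routine would instead invoke a quantum-mechanical electronic structure calculation at every step.

```python
import numpy as np

def velocity_verlet(x, v, force, dt, steps, m=1.0):
    """Integrate Newton's equations with the velocity-Verlet scheme.
    In ab initio MD the force() call would be a quantum-mechanical
    calculation; here it is a toy classical potential."""
    a = force(x) / m
    traj = [x.copy()]
    for _ in range(steps):
        x = x + v * dt + 0.5 * a * dt * dt    # position update
        a_new = force(x) / m                  # force at the new position
        v = v + 0.5 * (a + a_new) * dt        # velocity update (averaged force)
        a = a_new
        traj.append(x.copy())
    return np.array(traj), v

# Toy example: one particle in a harmonic well, F = -k x, period 2*pi
k = 1.0
force = lambda x: -k * x
traj, v = velocity_verlet(np.array([1.0]), np.array([0.0]), force, dt=0.01, steps=628)
```

The scheme's virtue, and the reason it is ubiquitous in molecular dynamics, is that it conserves energy extremely well over long runs: after a full oscillation period above, the particle returns almost exactly to its starting point.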
Smart Algorithms Focus Computational Work
Increasingly realistic 3D simulations have been made possible by the development of massively parallel computers with ever faster processors and larger memories. The fidelity of such simulations can be further enhanced by the use of modern "smart" algorithms such as adaptive mesh refinement, which automatically concentrate computational effort in the parts of the simulation where it is most needed. This technology, developed through fundamental collaborative research by computational scientists at Lawrence Berkeley National Laboratory and New York University, is used in this Lawrence Livermore simulation of laser beam filamentation in a plasma. The adaptive mesh calculation shown at the bottom required only 8% of the computer time needed to obtain the same solution on a non-adaptive mesh. The two-dimensional computation at the top illustrates the hundreds of refined Cartesian blocks that together make up the composite adaptive grid in a fluid-dynamical shock computation.
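The essence of adaptive refinement can be shown in one dimension: refine a cell only where a local estimate says the function is poorly resolved. The sketch below is my own illustration (the actual LBNL/NYU algorithms manage hierarchies of Cartesian blocks in two and three dimensions); it clusters grid points around a steep front and leaves the rest of the domain coarse.

```python
import numpy as np

def adaptive_sample(f, a, b, tol, depth=0, max_depth=10):
    """Recursively refine an interval wherever a midpoint check shows the
    function is not well represented by linear interpolation -- the 1D
    essence of adaptive mesh refinement: work only where it is needed."""
    mid = 0.5 * (a + b)
    linear = 0.5 * (f(a) + f(b))              # linear-interpolation estimate
    if depth >= max_depth or abs(f(mid) - linear) < tol:
        return [a, b]                         # cell is resolved; keep it coarse
    left = adaptive_sample(f, a, mid, tol, depth + 1, max_depth)
    right = adaptive_sample(f, mid, b, tol, depth + 1, max_depth)
    return left[:-1] + right                  # merge, dropping duplicated midpoint

# Steep front near x = 0.3: cells cluster there and stay coarse elsewhere
f = lambda x: np.tanh(50 * (x - 0.3))
grid = adaptive_sample(f, 0.0, 1.0, tol=1e-2)
print(len(grid))  # far fewer points than a uniform grid of equal resolution
```

A uniform grid resolving the front this finely would need hundreds of points everywhere; the adaptive grid spends them only near the front, which is exactly the 8% story told by the poster, in miniature.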
Patterns Emerge from Large Databases
Automatic pattern recognition algorithms enable scientists to identify interesting features in data that are collected at a far greater rate than they can be assimilated and understood. Pattern recognition software developed at Lawrence Livermore enabled university researchers at the Space Telescope Science Institute to identify previously unnoticed "bent-double" radio-emitting galaxies, such as those shown here, in a very large computer database. These techniques are helping DOE and university scientists to explore and understand their data effectively and efficiently.
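As a schematic illustration of supervised pattern recognition (the Livermore software is far more sophisticated, and its details are not given here), a nearest-centroid classifier learns one prototype per class from labeled examples and assigns new observations to the nearest prototype.

```python
import numpy as np

def nearest_centroid_fit(X, y):
    """Compute one centroid per class label: a minimal supervised
    pattern-recognition scheme, standing in here for much more
    sophisticated classifiers used in practice."""
    labels = np.unique(y)
    return labels, np.array([X[y == c].mean(axis=0) for c in labels])

def nearest_centroid_predict(X, labels, centroids):
    """Assign each row of X to the class of the nearest centroid."""
    d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    return labels[np.argmin(d, axis=1)]

# Toy stand-in for "features extracted from radio images":
# two well-separated clusters of 2D feature vectors
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.5, (50, 2)), rng.normal(3, 0.5, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
labels, cents = nearest_centroid_fit(X, y)
pred = nearest_centroid_predict(X, labels, cents)
print((pred == y).mean())  # classification accuracy on the training data
```

The point of such automation is scale: once trained, the classifier can sift millions of candidate objects, flagging the rare interesting ones for human inspection.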
Climate Models Help Environmental Research
Understanding the impact of industrial pollutants on the future climate of our planet is of critical importance to U.S. government policy-makers. Scientists at DOE laboratories, the NSF-funded National Center for Atmospheric Research, the University of Wisconsin, and other universities are working collaboratively to develop a comprehensive, next-generation climate simulation tool. This model will incorporate the most current physics, chemistry, and computational algorithms available, and will fully exploit the latest massively parallel computers. Current versions of this tool are being used to study the scientific feasibility of ocean carbon sequestration, in which greenhouse gases are "hidden" by injecting them into the deep ocean. This image shows the simulated distribution of relative CO2 concentration after 20 years of deep ocean injection near Cape Hatteras.
Software Libraries Bring High-Performance Computational Tools to Scientists
Software libraries package reliable high-performance computational procedures, coded for use by applications scientists who are not experts in computer algorithms and architecture. Argonne National Laboratory's Portable, Extensible Toolkit for Scientific Computing (PETSc) is a library for solving equations that arise frequently in scientific simulations. This library is actively supported on hundreds of the nation's most powerful computers -- in DOE laboratories, at other federal agencies, and at universities. PETSc's solver procedures are, in turn, built on another library (MPICH) for communicating data among the thousands of processors of a parallel computer. MPICH was also written at Argonne and is even more widely distributed. Applications built on top of PETSc, such as the oil reservoir simulation and aerodynamic wing optimization shown here, have won several prizes for high performance. In turn, user demand has driven the addition of features to PETSc. Very few institutions other than the DOE laboratories could assemble and maintain the team of experts required to develop and maintain such libraries while making their efforts freely available to technical collaborators.
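To give a flavor of what a solver library like PETSc packages, here is a bare-bones conjugate-gradient iteration of my own in Python, an illustrative sketch only: PETSc wraps such Krylov methods together with preconditioning, parallel data distribution over MPI, and dozens of solver variants, all behind a uniform interface.

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    """Conjugate-gradient Krylov iteration for symmetric positive-definite
    systems A x = b -- the kind of algorithm a library such as PETSc
    provides in robust, parallel, preconditioned form."""
    x = np.zeros_like(b)
    r = b - A @ x                       # initial residual
    p = r.copy()                        # initial search direction
    rs = r @ r
    for i in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)           # optimal step length along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            return x, i + 1
        p = r + (rs_new / rs) * p       # new A-conjugate search direction
        rs = rs_new
    return x, max_iter

# 1D Laplacian: the archetypal SPD system arising from discretized PDEs
n = 50
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
x, iters = conjugate_gradient(A, b)
print(iters, np.linalg.norm(A @ x - b))
```

An application code using the real library would hand PETSc its matrix and right-hand side and select the solver and preconditioner at run time, which is precisely the "accessible to non-experts" point of the poster.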