Friday, August 28, 2009 3:30PM
ESS Building, Room 123
Computational Astrophysics in the 21st Century
Abstract: Astrophysics is transforming from a data-starved to a data-swamped discipline, fundamentally changing the nature of scientific inquiry and discovery. New technologies are enabling the detection, transmission, and storage of data of hitherto unimaginable quantity and quality across the electromagnetic, gravitational, and particle spectra. The observational data obtained in the next decade alone will exceed everything accumulated over the preceding four thousand years of astronomy. Realizing the full scientific potential of these observations will require correspondingly precise predictions from our theoretical models. Given the physical complexity of the systems involved, obtaining such predictions necessarily requires ever more detailed numerical modeling, which in turn generates a comparable volume of simulation data.
Computational astrophysics has an essential role to play in providing the point of contact between theory and observation. From the detailed theoretical predictions made possible by complex simulations, to the precise reference points obtained from painstaking analyses of the new observations, the development of astrophysics in the new millennium will be regulated by our computational capability. Here I will describe several ongoing programs in both observational and theoretical astrophysics for which high performance computing is mandatory. Particular attention will be paid to how these programs have evolved over the past decade in response to changes in computer architectures, growing data volumes, and expanding scientific goals, and to the implications for future-generation surveys such as DES, JDEM, and LSST.