Abstract

Astrophysics is transforming from a data-starved to a data-swamped discipline, fundamentally changing the nature of scientific inquiry and discovery. New technologies are enabling the detection, transmission, and storage of data of hitherto unimaginable quantity and quality across the electromagnetic, gravitational-wave, and particle spectra. The observational data obtained in the next decade alone will exceed everything accumulated over the preceding four thousand years of astronomy. Within the next year, no fewer than four large-scale photometric and spectroscopic surveys will be underway, each generating and/or utilizing tens of terabytes of data per year. Some will focus on the static universe, while others will greatly expand our knowledge of transient phenomena. Maximizing the science from these programs requires integrating the processing pipelines with high-performance computing resources coupled to large astrophysics databases, with near real-time turnaround. Here, I will present an overview of the history of transient studies in astronomy and of the first of these programs, the Palomar Transient Factory (PTF), which fundamentally changed the way we study these phenomena. In particular, I will highlight how PTF has enabled a much more robust nearby supernova program, allowing us to carry out next-generation cosmology programs with both Type Ia and Type II-P supernovae while at the same time discovering events that previously exploded only in the minds of theorists. I will also discuss the synergy between these programs and future spectroscopic surveys such as the Dark Energy Spectroscopic Instrument (DESI).