Dark Sky Simulations Collaboration
Our large simulation originated with a DOE INCITE proposal for
2014. The original motivation is documented nicely here:
http://dx.doi.org/10.5281/zenodo.10777
For convenience, the text is shown below:
The Dark Sky Simulations are an ongoing series of cosmological N-body
simulations designed to provide a quantitative and accessible model of the
evolution of the large-scale Universe. Cosmological simulations are the
cornerstone of theoretical analysis of structure in the Universe on scales
from kiloparsecs to gigaparsecs. Predictions from numerical models are
critical to almost every aspect of the study of dark matter and dark energy,
due to the intrinsically non-linear gravitational evolution of matter.

During the next few years, projects such as Pan-STARRS, the South Pole
Telescope (SPT) and the Dark Energy Survey (DES) will measure the spatial
distribution of large-scale structure in enormous volumes of space across
billions of years of cosmic evolution. At the other extreme (sub-galactic and
galactic scales, from 100 parsecs to a megaparsec), understanding the
distribution of dark matter within Milky Way-type halos is necessary to
interpret the results of Earth-based dark matter detection experiments.

The revolutionary transformation of cosmology from a qualitative to a
quantitative science has occurred over just the last twenty years. Driven by a
diverse suite of observations, the parameters describing the large-scale
Universe are now known to near 1% precision. Yet the precise nature of dark
matter and dark energy remains a mystery, and is unquestionably among the most
important unsolved problems in physics. Advances in modeling must keep pace
with observational advances if we are to understand the Universe that led to
these observations.
We have achieved superior performance on multiple generations of the world's
fastest supercomputers with our hashed oct-tree N-body code (HOT), spanning
two decades and garnering multiple Gordon Bell Prizes for significant
achievement in parallel processing. Using several new, integrated algorithmic
and computational-science advances embodied in version 2 of the code (2HOT),
combined with a unified data-analysis effort based on the widely adopted yt
project, we propose a far-reaching set of scientific goals.
We additionally aim to advance the state-of-the-art in domain decomposition and
hierarchical tree-based computational techniques relevant to many simulation
and data analysis problems. We will address a wide range of scientifically
relevant tests of the standard cosmological model, including measurements of
cluster abundance, void statistics, baryon acoustic oscillations,
redshift-space distortions, velocity statistics and gravitational lensing. At
small scales, we will test the abundance and central kinematics of the dwarf
spheroidal galaxy satellites, together with the related small-scale
gravitational physics that determines the expected signal for dark matter
detection experiments. Our
simulations will produce an unprecedented suite of accurate and reliable halo,
sub-halo and mock galaxy catalogs, which we will make publicly available.
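The hashed oct-tree technique mentioned above orders particles along a
space-filling curve, using the resulting integer keys both to index tree cells
in a hash table and to decompose the domain among processors. The following is
a minimal illustrative sketch of that keying idea, not the actual HOT/2HOT
implementation: a Morton (Z-order) key interleaves the bits of a particle's
quantized grid coordinates, and splitting the sorted key list into contiguous
chunks yields a simple space-filling-curve domain decomposition.

```python
def morton_key(ix, iy, iz, bits=10):
    """Interleave the bits of three integer grid coordinates into one key.

    Sorting particles by this key follows a Z-order curve, so nearby keys
    are usually nearby in space. In hashed tree codes the key can also
    serve as a hash-table index for tree cells. (Sketch only.)
    """
    key = 0
    for b in range(bits):
        key |= ((ix >> b) & 1) << (3 * b)
        key |= ((iy >> b) & 1) << (3 * b + 1)
        key |= ((iz >> b) & 1) << (3 * b + 2)
    return key

def quantize(x, box_size, bits=10):
    """Map a coordinate in [0, box_size) to an integer grid coordinate."""
    return min(int(x / box_size * (1 << bits)), (1 << bits) - 1)

def decompose(particles, box_size, n_ranks, bits=10):
    """Assign particles (x, y, z tuples) to ranks by sorted Morton key.

    Equal contiguous chunks of the sorted key list give each rank a
    spatially compact, load-balanced subdomain.
    """
    keys = sorted(
        morton_key(*(quantize(c, box_size, bits) for c in p), bits=bits)
        for p in particles
    )
    n = len(keys)
    return [keys[i * n // n_ranks:(i + 1) * n // n_ranks]
            for i in range(n_ranks)]
```

Real codes refine this picture considerably (adaptive key lengths, work-based
rather than count-based splits), but the sketch captures why a space-filling
curve makes both the decomposition and the tree traversal cache- and
communication-friendly.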
People
Michael Warren (LANL)
Alexander Friedland (LANL)
Daniel Holz (KITP, Chicago)
Samuel Skillman (KIPAC, Stanford)
Paul Sutter (Paris Institute of Astrophysics)
Matthew Turk (NCSA, UIUC)
Risa Wechsler (KIPAC, Stanford)