- Galaxies: Formation, Evolution, Phenomena
- Gamma-ray bursts and supernovae
- Astronomy and Astrophysical Research
- Scientific Research and Discoveries
- Cellular Mechanics and Interactions
- Stellar, planetary, and galactic studies
- Advanced Fluorescence Microscopy Techniques
- Pulsars and Gravitational Waves Research
- Cosmology and Gravitation Theories
- Computational Physics and Python Applications
- Adaptive optics and wavefront sensing
- Spacecraft and Cryogenic Technologies
- Parallel Computing and Optimization Techniques
- Big Data Technologies and Applications
- Advanced Frequency and Time Standards
- Impact of Light on Environment and Health
- Solar and Space Plasma Dynamics
- 3D Printing in Biomedical Research
- Advanced Data Storage Technologies
- Advanced Surface Polishing Techniques
- Astrophysics and Cosmic Phenomena
- Optical Systems and Laser Technology
- Scientific Measurement and Uncertainty Evaluation
- Force Microscopy Techniques and Applications
- Geophysics and Gravity Measurements
Argonne National Laboratory
2019-2023
University of Chicago
2019-2021
University of Florida
2010-2013
Florida Museum of Natural History
2011-2012
Goddard Space Flight Center
2011
North Florida/South Georgia Veterans Health System
2010-2011
This paper introduces cosmoDC2, a large synthetic galaxy catalog designed to support precision dark energy science with the Large Synoptic Survey Telescope (LSST). CosmoDC2 is the starting point for the second data challenge (DC2) carried out by the LSST Dark Energy Science Collaboration (LSST DESC). The catalog is based on a trillion-particle, (4.225 Gpc)^3 box cosmological N-body simulation, the Outer Rim run. It covers 440 deg^2 of sky area out to a redshift of z = and matches the expected number densities from...
We describe the Outer Rim cosmological simulation, one of the largest high-resolution N-body simulations performed to date, aimed at promoting the science to be carried out with large-scale structure surveys. The simulation covers a volume of (4.225 Gpc)^3 and evolves more than a trillion particles. It was executed on Mira, a BlueGene/Q system at the Argonne Leadership Computing Facility. We discuss some of the computational challenges posed by a many-core supercomputer like Mira, and how the code, HACC, has been designed to overcome these...
We describe the simulated sky survey underlying the second data challenge (DC2) carried out in preparation for analysis of the Vera C. Rubin Observatory Legacy Survey of Space and Time (LSST) by the LSST Dark Energy Science Collaboration (LSST DESC). Significant connections across multiple science domains will be a hallmark of LSST; the DC2 program represents a unique modeling effort that stresses this interconnectivity in a way that has not been attempted before. This effort encompasses a full end-to-end approach: starting...
We describe the first major public data release from cosmological simulations carried out with Argonne's HACC code. This initial release covers a range of datasets from large gravity-only simulations. The data products include halo information for multiple redshifts, down-sampled particles, and lightcone outputs. We provide two very large LCDM simulations as well as beyond-LCDM simulations spanning eleven w0-wa cosmologies. Our data distribution platform uses Petrel, a research data service located at the Argonne Leadership Computing Facility. Petrel offers fast data transfer...
Space-based gravitational wave detectors are conceived to detect gravitational waves in the low frequency range by measuring the distance between proof masses in spacecraft separated by millions of kilometers. One of the key elements is the telescope, which has to have a dimensional stability better than 1 pm Hz^{-1/2} at 3 mHz. In addition, the structure must be light, strong, and stiff. For this reason a potential design consisting of a silicon carbide quadpod has been designed, constructed, and tested. We present results meeting the requirements at room...
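The stability requirement above is stated as an amplitude spectral density. A minimal sketch of how such a requirement could be checked against a measured pathlength time series is below; the sample rate, segment length, and stand-in white-noise data are all illustrative assumptions, not values from the paper.

```python
import numpy as np
from scipy.signal import welch

# Hypothetical check of a pathlength series against the
# 1 pm / sqrt(Hz) stability requirement at 3 mHz.
fs = 1.0                                  # sample rate in Hz (assumed)
n = 200_000                               # ~2.3 days of data (assumed)

rng = np.random.default_rng(0)
# Stand-in data: white displacement noise, 0.1 pm rms per sample.
x = 0.1e-12 * rng.standard_normal(n)

# Welch estimate of the one-sided PSD; its square root is the
# amplitude spectral density in m / sqrt(Hz).
f, psd = welch(x, fs=fs, nperseg=2**16)
asd = np.sqrt(psd)

# Interpolate the ASD at 3 mHz and compare to the requirement.
req = 1e-12                               # m / sqrt(Hz)
asd_3mHz = np.interp(3e-3, f, asd)
print(f"ASD at 3 mHz: {asd_3mHz:.2e} m/sqrt(Hz) (requirement {req:.0e})")
print("PASS" if asd_3mHz < req else "FAIL")
```

With flat noise at 0.1 pm rms and a 1 Hz sample rate, the expected ASD sits near 0.14 pm/sqrt(Hz), comfortably below the requirement; real telescope data would of course carry colored noise and drifts that this sketch ignores.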
As part of the effort to meet the needs of the Large Synoptic Survey Telescope Dark Energy Science Collaboration (LSST DESC) for accurate, realistically complex mock galaxy catalogs, we have developed GalSampler, an open-source Python package that assists in generating large volumes of synthetic cosmological data. The key idea behind GalSampler is to recast hydrodynamical simulations and semi-analytic models as physically-motivated galaxy libraries. GalSampler populates a new, larger-volume halo catalog with galaxies drawn...
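The core idea — treating an expensive simulation as a library and drawing its galaxies onto a larger gravity-only halo catalog — can be sketched as a simple mass-binned resampling. This is not GalSampler's actual API; every array and parameter below is a stand-in for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical "source" library: halos from an expensive simulation,
# each carrying a small list of galaxy properties (here, luminosities).
src_halo_mass = 10 ** rng.uniform(11, 14, size=500)       # Msun (fake)
src_galaxies = [rng.lognormal(mean=np.log(m / 1e12), sigma=0.3,
                              size=rng.integers(1, 5))
                for m in src_halo_mass]

# Target: a larger gravity-only halo catalog to be populated.
tgt_halo_mass = 10 ** rng.uniform(11, 14, size=5000)

# Match source and target halos in narrow mass bins.
bins = np.logspace(11, 14, 16)
src_bin = np.digitize(src_halo_mass, bins)
tgt_bin = np.digitize(tgt_halo_mass, bins)

populated = []
for tgt_idx, b in enumerate(tgt_bin):
    candidates = np.flatnonzero(src_bin == b)
    if candidates.size == 0:
        continue  # no library halo falls in this mass bin
    donor = rng.choice(candidates)
    # Transfer the donor halo's galaxies onto the target halo.
    populated.append((tgt_idx, src_galaxies[donor]))

print(f"populated {len(populated)} of {tgt_halo_mass.size} target halos")
```

The appeal of this scheme is that the expensive galaxy-formation physics is paid for once, in the small source volume, while the statistical reach comes from the cheap large-volume halo catalog.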
The Last Journey is a large-volume, gravity-only, cosmological N-body simulation evolving more than 1.24 trillion particles in a periodic box with a side length of 5.025 Gpc. It was implemented using the HACC simulation and analysis framework on the BG/Q system Mira. The simulation parameters are chosen to be consistent with results from the Planck satellite. A range of analysis tools have been run in situ to enable a diverse set of science projects and, at the same time, to keep the resulting data volume manageable. Analysis outputs are generated starting at redshift z~10...
Large simulation efforts are required to provide synthetic galaxy catalogs for ongoing and upcoming cosmology surveys. These extragalactic catalogs are being used for many diverse purposes covering a wide range of scientific topics. In order to be useful, they must offer realistically complex information about the galaxies they contain. Hence, it is critical to implement a rigorous validation procedure that ensures that the simulated properties faithfully capture observations and delivers an assessment of the level of realism attained by...
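One concrete form such a validation check can take is a statistical comparison between a simulated galaxy property and an observed reference distribution. The sketch below uses a two-sample Kolmogorov-Smirnov test on a stand-in color distribution; the property, sample sizes, and pass/fail threshold are illustrative assumptions, not the paper's actual validation criteria.

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(1)

# Stand-in samples: a simulated galaxy color and an "observed"
# reference sample (both fabricated here for illustration).
sim_color = rng.normal(loc=0.75, scale=0.20, size=20_000)
obs_color = rng.normal(loc=0.75, scale=0.22, size=5_000)

# Two-sample KS test: the statistic is the maximum distance
# between the two empirical cumulative distributions.
stat, pvalue = ks_2samp(sim_color, obs_color)
print(f"KS statistic = {stat:.3f}, p = {pvalue:.3g}")

# A validation suite might flag the catalog when the distributions
# differ too strongly (the threshold here is purely illustrative).
threshold = 0.05
print("distribution check:", "pass" if stat < threshold else "fail")
```

In practice a validation suite runs many such tests at once — number densities, luminosity functions, color distributions, clustering — and reports a per-test assessment rather than a single pass/fail.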
The Laser Interferometer Space Antenna (LISA) is a mission designed to detect low frequency gravitational waves. In order for LISA to succeed in its goal of direct measurement of gravitational waves, many subsystems must work together to measure the distance between proof masses on adjacent spacecraft. One such subsystem, the telescope, plays a critical role as it serves as the transmission and reception link. Not only must the material that makes up the telescope support structure be strong, stiff, and light, but it must also have a dimensional stability better...
In preparation for cosmological analyses of the Vera C. Rubin Observatory Legacy Survey of Space and Time (LSST), the LSST Dark Energy Science Collaboration (LSST DESC) has created a 300 deg$^2$ simulated survey as part of an effort called Data Challenge 2 (DC2). The DC2 simulated sky survey, in six optical bands with observations following a reference observing cadence, was processed with the LSST Science Pipelines (19.0.0). In this Note, we describe a public data release of the resulting object catalogs and coadded images for five years, along...
The galaxy distribution in dark matter-dominated halos is expected to approximately trace the details of the underlying dark matter substructure. In this paper we introduce halo "core-tracking" as a way to efficiently follow small-scale substructure in cosmological simulations, and we apply the technique to model observed galaxy clusters. The method relies on explicitly tracking the set of particles identified as belonging to a halo's central density core, once the halo has attained a certain threshold mass. The cores are then followed throughout the entire...
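The mechanics of core-tracking — tag a fixed set of particle IDs near a halo's center at one snapshot, then locate the same IDs at later snapshots — can be sketched as follows. This is a toy illustration, not HACC's implementation: the positions are fabricated, the core is selected by distance to the center rather than by binding energy, and the core size is an arbitrary choice.

```python
import numpy as np

rng = np.random.default_rng(7)

# Snapshot A: fake particle data for a single halo at the moment it
# crosses the threshold mass.
n_part = 10_000
ids = np.arange(n_part)
pos0 = rng.normal(0.0, 1.0, size=(n_part, 3))   # positions (Mpc, fake)

halo_center = np.zeros(3)
n_core = 20  # number of core particles to tag (illustrative)

# Tag the n_core particles closest to the halo center.
r = np.linalg.norm(pos0 - halo_center, axis=1)
core_ids = ids[np.argsort(r)[:n_core]]

# Snapshot B: the same particles in shuffled storage order, with
# positions perturbed to mimic dynamical evolution.
order = rng.permutation(n_part)
ids_later = ids[order]
pos_later = (pos0 + rng.normal(0.0, 0.05, size=(n_part, 3)))[order]

# Locate the tagged core purely by particle ID and report where the
# core has moved to.
mask = np.isin(ids_later, core_ids)
core_centroid = pos_later[mask].mean(axis=0)
print("core centroid at later snapshot:", core_centroid)
```

Because only the small tagged set is carried forward, the core's trajectory survives even after the halo itself merges into a larger object and is no longer identified by the halo finder — which is what makes the technique useful for modeling cluster substructure.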