- Galaxies: Formation, Evolution, Phenomena
- Parallel Computing and Optimization Techniques
- Astronomy and Astrophysical Research
- Cosmology and Gravitation Theories
- Radio Astronomy Observations and Technology
- Distributed and Parallel Computing Systems
- Advanced Data Storage Technologies
- Computational Physics and Python Applications
- Computer Graphics and Visualization Techniques
- Scientific Research and Discoveries
- Data Management and Algorithms
- Remote Sensing in Agriculture
- Robotics and Automated Systems
- Astronomical Observations and Instrumentation
- Algorithms and Data Compression
- Bayesian Methods and Mixture Models
- Advanced Clustering Algorithms Research
- Stellar, planetary, and galactic studies
- Embedded Systems Design Techniques
- Plant Molecular Biology Research
- Cryospheric studies and observations
- Data Visualization and Analytics
- Astrophysics and Cosmic Phenomena
- Adaptive optics and wavefront sensing
- Gamma-ray bursts and supernovae
Argonne National Laboratory, 2016-2024
Argonne Leadership Computing Facility, 2024
Office of Scientific and Technical Information, 2022
University of Chicago, 2017-2022
National Technical Information Service, 2022
Northwestern University, 2016-2018
Abstract This paper introduces cosmoDC2, a large synthetic galaxy catalog designed to support precision dark energy science with the Large Synoptic Survey Telescope (LSST). CosmoDC2 is the starting point for the second data challenge (DC2) carried out by the LSST Dark Energy Science Collaboration (LSST DESC). The catalog is based on a trillion-particle, (4.225 Gpc)^3 box cosmological N-body simulation, the Outer Rim run. It covers 440 deg^2 of sky area to a redshift of z = 3 and matches the expected number densities from...
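As an illustration of how such a synthetic catalog is typically consumed, the sketch below selects a magnitude-limited galaxy sample from an HDF5 file and reports its surface density. The file name, dataset paths, magnitude cut, and patch area are hypothetical placeholders, not the actual cosmoDC2 schema or access tools.

```python
import h5py
import numpy as np

# File name, dataset paths, and cuts below are illustrative placeholders,
# not the actual cosmoDC2 schema.
with h5py.File("galaxy_catalog_patch.hdf5", "r") as f:
    z = f["galaxyProperties/redshift"][:]
    mag_i = f["galaxyProperties/mag_i_lsst"][:]

mask = mag_i < 25.3            # hypothetical magnitude-limited selection
area_deg2 = 25.0               # hypothetical sky-patch area
print(f"galaxies per deg^2 brighter than i = 25.3: {mask.sum() / area_deg2:.1f}")
print(f"median redshift of the selection: {np.median(z[mask]):.2f}")
```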
We describe the Outer Rim cosmological simulation, one of the largest high-resolution N-body simulations performed to date, aimed at promoting the science to be carried out with large-scale structure surveys. The simulation covers a volume of (4.225 Gpc)^3 and evolves more than a trillion particles. It was executed on Mira, a BlueGene/Q system at the Argonne Leadership Computing Facility. We discuss some of the computational challenges posed by such a many-core supercomputer and how the simulation code, HACC, has been designed to overcome these...
We describe the first major public data release from cosmological simulations carried out with Argonne's HACC code. This initial release covers a range of datasets from large gravity-only simulations. The data products include halo information for multiple redshifts, down-sampled particles, and lightcone outputs. We provide two very large LCDM simulations as well as beyond-LCDM simulations spanning eleven w0-wa cosmologies. Our data release platform uses Petrel, a research data service located at the Argonne Leadership Computing Facility. Petrel offers fast data transfer...
ABSTRACT We compare two state-of-the-art numerical codes to study the overall accuracy in modelling the intergalactic medium and reproducing Lyman-α forest observables for DESI and high-resolution data sets. The codes employ different approaches to solving both gravity and gas hydrodynamics. The first code, Nyx, solves the Poisson equation using the Particle-Mesh (PM) method and the Euler equations using a finite-volume method. The second code, CRK-HACC, uses a Tree-PM method to solve gravity, and an improved Lagrangian smoothed particle hydrodynamics (SPH)...
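One of the Lyman-α forest observables referred to here is the one-dimensional flux power spectrum. The sketch below computes it from a single transmitted-flux skewer with a plain FFT, under an assumed skewer length and a toy flux array; it is not either code's analysis pipeline, and the normalization is only one common convention.

```python
import numpy as np

def flux_power_1d(flux, length):
    """1D power spectrum of the flux contrast delta_F = F/<F> - 1.

    Uses the convention P(k) = (L / N^2) |FFT(delta_F)|^2, with k in
    radians per unit of `length` (e.g. comoving Mpc/h).
    """
    n = flux.size
    delta = flux / flux.mean() - 1.0
    dk = np.fft.rfft(delta)
    power = (length / n**2) * np.abs(dk) ** 2
    k = 2.0 * np.pi * np.fft.rfftfreq(n, d=length / n)
    return k[1:], power[1:]                 # drop the k = 0 mode

# Toy example: a random lognormal-like "flux" skewer standing in for simulation output.
rng = np.random.default_rng(0)
flux = np.exp(-np.exp(rng.normal(size=2048) * 0.5))
k, pk = flux_power_1d(flux, length=80.0)    # assumed 80 Mpc/h skewer
print(k[:3], pk[:3])
```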
The Last Journey is a large-volume, gravity-only, cosmological N-body simulation evolving more than 1.24 trillion particles in a periodic box with a side length of 5.025 Gpc. It was carried out using the HACC simulation and analysis framework on the BG/Q system Mira. The simulation parameters are chosen to be consistent with results from the Planck satellite. A range of analysis tools has been run in situ to enable a diverse set of science projects and, at the same time, keep the resulting data amount manageable. Analysis outputs are generated starting at redshift z~10...
ABSTRACT Gravitational lensing has become one of the most powerful tools available for investigating the “dark side” of the universe. Cosmological strong gravitational lensing, in particular, probes the properties of the dense cores of dark matter halos over decades in mass and offers the opportunity to study the distant universe at flux levels and spatial resolutions otherwise unavailable. Studies of strongly lensed variable sources offer even further scientific opportunities. One of the challenges to realizing this potential is to understand...
We present a study of density estimation, the conversion of discrete particle positions to a continuous field defined over a three-dimensional Cartesian grid. The study features a methodology for evaluating the accuracy and performance of various density estimation methods, the results of that evaluation for four estimators, and a large-scale parallel algorithm for a self-adaptive method that computes a Voronoi tessellation as an intermediate step. We demonstrate the scalability of our algorithm on a supercomputer when estimating the density of 100 million particles on 500 billion grid points.
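For context, the sketch below implements one of the simplest grid-based estimators, cloud-in-cell (CIC) deposition, which is a common baseline in such comparisons; it is not the tessellation-based self-adaptive method described above.

```python
import numpy as np

def cic_density(positions, box_size, n_grid):
    """Cloud-in-cell deposition of unit-mass particles onto a periodic 3D grid.

    positions: (N, 3) coordinates in [0, box_size).
    Returns the density contrast field delta = rho / <rho> - 1.
    """
    grid = np.zeros((n_grid,) * 3)
    cell = box_size / n_grid
    x = positions / cell
    i0 = np.floor(x).astype(int)          # lower cell index along each axis
    f = x - i0                            # fractional offset within the cell
    for dx in (0, 1):
        for dy in (0, 1):
            for dz in (0, 1):
                w = (np.abs(1 - dx - f[:, 0]) *
                     np.abs(1 - dy - f[:, 1]) *
                     np.abs(1 - dz - f[:, 2]))       # CIC weights per particle
                idx = (i0 + (dx, dy, dz)) % n_grid
                np.add.at(grid, (idx[:, 0], idx[:, 1], idx[:, 2]), w)
    return grid / grid.mean() - 1.0

rng = np.random.default_rng(1)
delta = cic_density(rng.uniform(0.0, 100.0, size=(10000, 3)), box_size=100.0, n_grid=32)
print(delta.std())
```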
Abstract In this paper we introduce the Farpoint simulation, the latest member of the Hardware/Hybrid Accelerated Cosmology Code (HACC) gravity-only simulation family. The simulation domain covers a volume of (1000 h^-1 Mpc)^3 and evolves close to two trillion particles, corresponding to a mass resolution of m_p ~ 4.6 × 10^7 M_⊙. These specifications enable comprehensive investigations of the galaxy–halo connection, capturing halos down to small masses. Further, the large volume resolves scales typical of modern surveys with good statistical...
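The quoted specifications can be cross-checked with the standard relation m_p = Ω_m ρ_crit V / N_p. In the sketch below, Ω_m ≈ 0.31 and a particle count of 12288^3 are assumed, Planck-like values that are not stated in the excerpt above.

```python
# Back-of-the-envelope check of the quoted mass resolution from box size and
# particle count: m_p = Omega_m * rho_crit * V / N_p (in h^-1 Msun units).
# Omega_m ~ 0.31 and N_p = 12288^3 are assumed values, not taken from the abstract.
rho_crit = 2.775e11        # critical density in h^2 Msun / Mpc^3
omega_m = 0.31
box = 1000.0               # Mpc/h
n_p = 12288**3             # assumed particle count, ~1.86 trillion

m_p = omega_m * rho_crit * box**3 / n_p
print(f"particle mass ~ {m_p:.2e} h^-1 Msun")   # ~4.6e7, consistent with the abstract
```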
HPC is a heterogeneous world in which host and device code are interleaved throughout the application. Given the significant performance advantage of accelerators, the execution time of the accelerated parts is becoming the new bottleneck. Tuning the accelerated parts is consequently highly desirable but often impractical, since the large overall application runtime includes unrelated parts.
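A common response to this problem is to capture a kernel's inputs once during a full run and then tune the kernel in isolation by replaying them. The sketch below illustrates that record-and-replay idea generically; the file name and the stand-in kernel are placeholders, not any particular tool's interface.

```python
import time
import numpy as np

def capture_inputs(path, **arrays):
    """During one full application run, save a kernel's inputs for later replay."""
    np.savez(path, **arrays)

def replay_and_time(path, kernel, repeats=10):
    """Reload the captured inputs and time the kernel alone, outside the application."""
    data = np.load(path)
    args = {k: data[k] for k in data.files}
    start = time.perf_counter()
    for _ in range(repeats):
        kernel(**args)
    return (time.perf_counter() - start) / repeats

# Placeholder "kernel": stands in for an accelerated routine being tuned.
def force_kernel(positions):
    return np.linalg.norm(positions, axis=1)

capture_inputs("kernel_inputs.npz", positions=np.random.rand(100000, 3))
print(f"mean kernel time: {replay_and_time('kernel_inputs.npz', force_kernel):.4f} s")
```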
Cosmological simulations produce a multitude of data types whose large scale makes them difficult to thoroughly explore in an interactive setting. One aspect of particular interest to scientists is the evolution of groups of dark matter particles, or "halos," described by merger trees. However, in order to fully understand the subtleties of merger trees, other data derived from the simulation must be incorporated as well. In this work, we develop a novel linked-view visualization system that focuses on simultaneously exploring halos...
Cosmological cluster-scale strong gravitational lensing probes the mass distribution of the dense cores of massive dark matter halos and of structures along the line of sight from background sources to the observer. It is frequently assumed that the primary lens dominates the lensing, with the contribution of secondary masses being neglected. Secondary masses may, however, affect the detectability of strong lensing in a given survey and modify the properties of the detected systems. In this paper, we utilize a large cosmological N-body simulation and a multiple lens plane (and many...
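The multiple-lens-plane approach mentioned here propagates each ray through the lens planes one at a time. The sketch below implements the recursion x_j = theta - sum_{i<j} (D_ij / D_j) alpha_i(x_i) in one common convention; the deflection functions and distances are toy placeholders, and differently scaled conventions exist.

```python
import numpy as np

def multiplane_ray_trace(theta, deflections, d_plane, d_between):
    """Propagate an angular position through a stack of lens planes.

    theta:       observed angle (2-vector, radians); equals the position on plane 0.
    deflections: deflection-angle callables alpha_i(x_i), one per lens plane.
    d_plane[j]:  angular-diameter distance from the observer to plane j
                 (the last plane is the source plane).
    d_between[i][j]: angular-diameter distance between planes i and j.
    Uses x_j = theta - sum_{i<j} (D_ij / D_j) * alpha_i(x_i); one common convention.
    """
    positions = [np.asarray(theta, dtype=float)]
    for j in range(1, len(d_plane)):
        shift = sum((d_between[i][j] / d_plane[j]) * deflections[i](positions[i])
                    for i in range(j))
        positions.append(positions[0] - shift)
    return positions        # positions[-1] is the source-plane position

def point_lens(theta_e):
    """Toy point-mass deflection alpha(x) = theta_E^2 * x / |x|^2 (softened)."""
    return lambda x: theta_e**2 * x / (x @ x + 1e-18)

# Two lens planes plus a source plane; all distances are illustrative numbers only.
d_plane = [1000.0, 1600.0, 2200.0]
d_between = [[0.0, 700.0, 1400.0], [0.0, 0.0, 800.0], [0.0, 0.0, 0.0]]
beta = multiplane_ray_trace([1.0e-5, 0.0],
                            [point_lens(2.0e-6), point_lens(1.0e-6)],
                            d_plane, d_between)[-1]
print(beta)
```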
Precision measurements of the large-scale structure of the Universe require large numbers of high-fidelity mock catalogs to accurately assess, and account for, the presence of systematic effects. We introduce and test a scheme for rapidly generating such catalogs using suitably derated N-body simulations. Our aim is to reproduce the gross properties of dark matter halos with good accuracy, while sacrificing the details of halo internal structure. By adjusting global and local time-steps in an N-body code, we demonstrate that we can recover halo masses to better than 0.5% and the power...
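A validation of this kind ultimately reduces to halo-by-halo comparisons between the reference and derated runs. The sketch below matches halos by a shared identifier (an assumption of the sketch; position matching is another common choice) and summarizes the fractional mass differences.

```python
import numpy as np

def compare_halo_masses(ids_ref, m_ref, ids_test, m_test):
    """Match halos by a shared identifier and return fractional mass differences."""
    common, i_ref, i_test = np.intersect1d(ids_ref, ids_test, return_indices=True)
    frac = m_test[i_test] / m_ref[i_ref] - 1.0
    return common, frac

# Toy catalogs standing in for the reference and derated runs.
rng = np.random.default_rng(2)
ids = np.arange(10000)
m_ref = 10 ** rng.uniform(12, 15, size=ids.size)
m_test = m_ref * (1 + rng.normal(0, 0.004, size=ids.size))
_, frac = compare_halo_masses(ids, m_ref, ids, m_test)
print(f"median |dM/M| = {np.median(np.abs(frac)):.4f}")
```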
This paper will evaluate the progress being made on achieving performance portability by a sub-set of ECP applications, or their related mini-apps, across a diverse spectrum of application domains and approaches to portability. The mini-apps evaluated are AMR-Wind, HACC, SW4, GAMESS RI-MP2, XSBench, and TestSNAP. These codes have been redeveloped using the SYCL, OpenMP, RAJA, and Kokkos programming models as well as the AMReX framework; in this work we assess them on AMD MI100, Intel Gen9, and NVIDIA A100 GPUs. Since each GPU has different...
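Evaluations like this commonly report the performance-portability metric of Pennycook et al., the harmonic mean of per-platform efficiencies, which drops to zero if any platform is unsupported. The sketch below computes that metric; whether this particular paper uses exactly this definition is an assumption.

```python
def performance_portability(efficiencies):
    """Harmonic-mean performance portability (Pennycook et al.).

    efficiencies: per-platform efficiency in (0, 1], or 0/None if the
    application does not run on that platform. Returns 0 if any platform
    is unsupported.
    """
    if any(not e for e in efficiencies):
        return 0.0
    return len(efficiencies) / sum(1.0 / e for e in efficiencies)

# Example: architectural efficiencies on three GPUs (illustrative numbers only).
print(performance_portability([0.72, 0.55, 0.68]))   # ~0.64
print(performance_portability([0.72, 0.0, 0.68]))    # 0.0 -> not portable to all
```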
The k-medoids methods for modeling clustered data have many desirable properties, such as robustness to noise and the ability to use non-numerical values; however, they are typically not applied to large datasets due to their associated computational complexity. In this paper, we present AGORAS, a novel heuristic algorithm for the k-medoids problem in which the algorithmic complexity is driven by k, the number of clusters, rather than n, the number of points. Our algorithm attempts to isolate a sample from each individual cluster within a sequence...
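For reference, a conventional alternating ("Voronoi iteration") k-medoids baseline is sketched below; its per-iteration cost grows with the dataset size, which is exactly the limitation AGORAS targets. This is a generic baseline, not the AGORAS algorithm itself.

```python
import numpy as np

def k_medoids(X, k, n_iter=20, seed=0):
    """Alternating (Voronoi-iteration) k-medoids baseline, not the AGORAS heuristic.

    Assign every point to its nearest medoid, then move each medoid to the cluster
    member that minimizes total within-cluster distance; repeat until stable.
    """
    rng = np.random.default_rng(seed)
    medoids = rng.choice(len(X), size=k, replace=False)
    for _ in range(n_iter):
        dist = np.linalg.norm(X[:, None, :] - X[medoids][None, :, :], axis=2)
        labels = dist.argmin(axis=1)
        new = medoids.copy()
        for c in range(k):
            members = np.flatnonzero(labels == c)
            if members.size:
                within = np.linalg.norm(X[members][:, None] - X[members][None, :], axis=2)
                new[c] = members[within.sum(axis=1).argmin()]
        if np.array_equal(new, medoids):
            break
        medoids = new
    dist = np.linalg.norm(X[:, None, :] - X[medoids][None, :, :], axis=2)
    return medoids, dist.argmin(axis=1)

# Toy data: three well-separated 2D blobs.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(loc=c, scale=0.5, size=(100, 2)) for c in (0.0, 5.0, 10.0)])
medoids, labels = k_medoids(X, k=3)
print(X[medoids])
```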
We improve the interpolation accuracy and efficiency of the Delaunay tessellation field estimator (DTFE) for surface density reconstruction by proposing an algorithm that takes advantage of an adaptive triangular mesh for line-of-sight integration. The costly computation of an intermediate 3D grid is completely avoided: our method computes only optimally chosen points, and thus the overall computational cost is significantly reduced. The method is implemented as a parallel shared-memory kernel for large-scale rendered reconstructions...
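The core DTFE idea, estimating the density at each point from the area of its adjacent Delaunay simplices, can be illustrated in two dimensions with a few lines of SciPy. The sketch below is a minimal, unoptimized illustration, not the adaptive line-of-sight algorithm proposed in the paper.

```python
import numpy as np
from scipy.spatial import Delaunay

def dtfe_2d(points, masses=None):
    """Per-point DTFE surface density in 2D.

    rho_i = (D + 1) * m_i / (summed area of the Delaunay triangles sharing vertex i),
    with D = 2; a minimal illustration of the estimator, not an optimized kernel.
    """
    if masses is None:
        masses = np.ones(len(points))
    tri = Delaunay(points)
    a, b, c = (points[tri.simplices[:, i]] for i in range(3))
    ab, ac = b - a, c - a
    area = 0.5 * np.abs(ab[:, 0] * ac[:, 1] - ab[:, 1] * ac[:, 0])  # triangle areas
    vertex_area = np.zeros(len(points))
    for i in range(3):
        np.add.at(vertex_area, tri.simplices[:, i], area)           # area around each vertex
    return 3.0 * masses / vertex_area

pts = np.random.default_rng(3).uniform(0.0, 1.0, size=(500, 2))
rho = dtfe_2d(pts)
print(f"density range: {rho.min():.0f} to {rho.max():.0f} points per unit area")
```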
Abstract This paper introduces Subhalo Mass-loss Analysis using Core Catalogs (SMACC). SMACC adds a mass model to substructure merger trees based on halo “core tracking.” Our approach avoids the need for running expensive subhalo finding algorithms and instead uses mass-loss modeling to assign masses to cores. We present the details of the methodology and demonstrate its excellent performance in describing substructure evolution. Validation is carried out using cosmological simulations at significantly different resolutions...
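The excerpt does not give SMACC's calibrated mass-loss model, so the sketch below only illustrates the general idea of assigning a declining mass to a tracked core, using a toy exponential law on a dynamical time; the functional form and parameter values are purely illustrative assumptions.

```python
import numpy as np

def toy_core_mass(m_infall, t_since_infall, t_dyn, tau=1.5):
    """Toy subhalo mass-loss law: m(t) = m_infall * exp(-t / (tau * t_dyn)).
    Purely illustrative; SMACC's calibrated model is not reproduced here."""
    return m_infall * np.exp(-t_since_infall / (tau * t_dyn))

print(toy_core_mass(1e12, t_since_infall=3.0, t_dyn=2.0))  # masses in Msun, times in Gyr
```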
While the computing power of supercomputers continues to improve at an astonishing rate, the companion I/O systems are struggling to keep up in performance. To mitigate the performance gap, several supercomputing systems have been configured to incorporate burst buffers into their I/O stack; their exact role, however, still remains unclear. In this paper, we examine the features of burst buffers and study their impact on application I/O performance. Our goal is to demonstrate that burst buffers can be utilized by parallel I/O libraries to significantly improve performance. To this end, we developed a driver in PnetCDF...
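Conceptually, a burst buffer absorbs write bursts on a fast tier and drains them to the parallel file system later. The toy sketch below stages files through a local directory to make that flow concrete; the real work described here happens inside the PnetCDF driver, transparently to the application, and the paths used are placeholders.

```python
import os
import shutil

class StagedWriter:
    """Write output to a fast local 'burst buffer' path, flush to the parallel
    file system later. A conceptual sketch only; the PnetCDF burst-buffer driver
    described in the paper works inside the I/O library, not at this level."""
    def __init__(self, burst_dir, pfs_dir):
        self.burst_dir, self.pfs_dir = burst_dir, pfs_dir
        os.makedirs(burst_dir, exist_ok=True)
        os.makedirs(pfs_dir, exist_ok=True)

    def write(self, name, data: bytes):
        with open(os.path.join(self.burst_dir, name), "wb") as f:
            f.write(data)                      # fast, absorbs the write burst

    def flush(self):
        for name in os.listdir(self.burst_dir):
            shutil.move(os.path.join(self.burst_dir, name),
                        os.path.join(self.pfs_dir, name))   # drain to the PFS

w = StagedWriter("/tmp/bb_stage", "/tmp/pfs_out")   # placeholder paths
w.write("step_0001.dat", b"\x00" * 1024)
w.flush()
```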
Cosmological N-body simulations rank among the most computationally intensive efforts today. A key challenge is the analysis of structure, substructure, and merger history for many billions of compact particle clusters, called halos. Effectively representing merging halos is essential for the galaxy formation models used to generate synthetic sky catalogs, an important application of modern cosmological simulations. Generating realistic mock catalogs requires computing halo merger trees from simulations with large volumes over time...
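A merger tree is, at its core, a simple linked structure: each halo points to its progenitors at the previous snapshot, and the chain of most massive progenitors forms the main branch. The minimal sketch below shows that structure and traversal; it is not the paper's parallel tree-construction algorithm.

```python
from dataclasses import dataclass, field

@dataclass
class HaloNode:
    """One halo at one snapshot; progenitors are its matched halos one step earlier."""
    halo_id: int
    snapshot: int
    mass: float
    progenitors: list = field(default_factory=list)

def main_branch(halo):
    """Follow the most massive progenitor back in time (the tree's main branch)."""
    branch = [halo]
    while halo.progenitors:
        halo = max(halo.progenitors, key=lambda h: h.mass)
        branch.append(halo)
    return branch

# Tiny example tree: a halo formed from a major merger one snapshot earlier.
a = HaloNode(1, snapshot=8, mass=4e13)
b = HaloNode(2, snapshot=8, mass=1e13)
root = HaloNode(3, snapshot=9, mass=5.2e13, progenitors=[a, b])
print([h.halo_id for h in main_branch(root)])   # [3, 1]
```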
The galaxy distribution in dark matter-dominated halos is expected to approximately trace the details of the underlying matter substructure. In this paper we introduce halo "core-tracking" as a way to efficiently follow small-scale substructure in cosmological simulations, and we apply the technique to model observed galaxy clusters. The method relies on explicitly tracking the set of particles identified as belonging to a halo's central density core, once the halo has attained a certain threshold mass. The cores are then followed throughout the entire...
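The core-tracking idea can be sketched in a few lines: once a halo crosses the threshold mass, record the IDs of its innermost particles, then locate those IDs in later snapshots and take their median position as the core location. The sketch below is a toy illustration with random data, not the production HACC implementation.

```python
import numpy as np

def select_core_ids(ids, positions, center, n_core=20):
    """Record the IDs of the n_core particles closest to the halo center."""
    r = np.linalg.norm(positions - center, axis=1)
    return ids[np.argsort(r)[:n_core]]

def core_position(core_ids, snapshot_ids, snapshot_positions):
    """Locate the tracked particles in a later snapshot; the median of their
    positions serves as the core location."""
    mask = np.isin(snapshot_ids, core_ids)
    return np.median(snapshot_positions[mask], axis=0)

# Toy usage with random particles standing in for a halo and a later snapshot.
rng = np.random.default_rng(4)
ids = np.arange(5000)
pos0 = rng.normal(0.0, 0.5, size=(5000, 3))
core_ids = select_core_ids(ids, pos0, center=np.zeros(3))
pos1 = pos0 + rng.normal(0.05, 0.02, size=pos0.shape)   # particles drift slightly
print(core_position(core_ids, ids, pos1))
```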
The first generation of exascale systems will include a variety of machine architectures, featuring GPUs from multiple vendors. As a result, many developers are interested in adopting portable programming models to avoid maintaining multiple versions of their code. It is necessary to document experiences with such models to help developers understand the advantages and disadvantages of different approaches. To this end, this paper evaluates the performance portability of a SYCL implementation of a large-scale cosmology application (CRK-HACC)...