- Distributed and Parallel Computing Systems
- Advanced Data Storage Technologies
- Particle Detector Development and Performance
- Particle Physics Theoretical and Experimental Studies
- Scientific Computing and Data Management
- Parallel Computing and Optimization Techniques
- Neutrino Physics Research
- Computational Physics and Python Applications
- Radiation Detection and Scintillator Technologies
- Medical Imaging Techniques and Applications
- High-Energy Particle Collisions Research
- Astrophysics and Cosmic Phenomena
- Quantum Computing Algorithms and Architecture
- Cloud Computing and Resource Management
- Cosmology and Gravitation Theories
- Dark Matter and Cosmic Phenomena
- Quantum Information and Cryptography
- Big Data Technologies and Applications
- Distributed Systems and Fault Tolerance
- Quantum Chromodynamics and Particle Interactions
- Galaxies: Formation, Evolution, Phenomena
- Advanced Graph Neural Networks
- Reservoir Engineering and Simulation Methods
- Atomic and Subatomic Physics Research
- Radiation Effects in Electronics
Fermi National Accelerator Laboratory
2015-2024
University of Cincinnati
2024
Fermi Research Alliance
2007-2008
Helsinki Institute of Physics
2006
SLAC National Accelerator Laboratory
2003
Argonne National Laboratory
2002
Machine learning is an important applied research area in particle physics, beginning with applications to high-level physics analysis in the 1990s and 2000s, followed by an explosion of applications in event identification and reconstruction in the 2010s. In this document we discuss promising future development areas for machine learning in particle physics and a roadmap for their implementation, software and hardware resource requirements, collaborative initiatives with the data science community, academia and industry, and training the particle physics community in data science. The main objective is to connect...
Pattern recognition problems in high energy physics are notably different from traditional machine learning applications such as computer vision. Reconstruction algorithms identify and measure the kinematic properties of particles produced in collisions and recorded with complex detector systems. Two critical reconstruction tasks are finding charged particle trajectories in tracking detectors and showers in calorimeters. These two tasks have unique challenges and characteristics, but both feature high dimensionality, a high degree of sparsity, and complex geometric layouts....
The Exa.TrkX project has applied geometric learning concepts such as metric learning and graph neural networks to HEP particle tracking. Exa.TrkX's tracking pipeline groups detector measurements to form track candidates and filters them. This pipeline, originally developed using the TrackML dataset (a simulation of an LHC-inspired detector), has been demonstrated on other detectors, including the DUNE Liquid Argon TPC and the CMS High-Granularity Calorimeter. This paper documents new developments needed to study the physics and computing...
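A minimal sketch of the grouping-and-filtering idea the pipeline describes: embed hits, connect neighbours in the embedded space, and cut on edge length. The shapes, thresholds, and random data are illustrative assumptions, not the Exa.TrkX implementation.

```python
# Toy metric-learning-style hit grouping: embed, connect, filter.
import numpy as np
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
hits = rng.normal(size=(1000, 3))          # stand-in for (x, y, z) spacepoints

def embed(points):
    """Toy stand-in for a learned embedding network: here, the identity."""
    return points

embedded = embed(hits)

# Connect each hit to its k nearest neighbours in the embedded space;
# the resulting edges are the candidate track segments to be filtered.
knn = NearestNeighbors(n_neighbors=5).fit(embedded)
dist, idx = knn.kneighbors(embedded)

# Filtering stage: keep only short edges, a crude proxy for the learned
# edge-filter network in the real pipeline.
edges = [(i, j) for i, row in enumerate(idx)
         for j, d in zip(row, dist[i]) if i != j and d < 0.5]
print(f"{len(edges)} candidate segments survive the filter")
```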
Future "Intensity Frontier" experiments at Fermilab are likely to be conducted by smaller collaborations, with fewer scientists, than is the case for recent "Energy Frontier" experiments. art is a C++ event-processing framework designed with the needs of such experiments in mind. An evolution from the framework of the CMS experiment, art was designed and implemented to be usable by multiple experiments without imposing undue maintenance effort requirements on either the developers or the experiments using it. We describe its key features, the rationale behind evolutionary changes, and additions and simplifications...
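To illustrate the general event-processing-framework pattern described here (configurable modules applied to each event in turn), a generic Python sketch follows. This is not art's actual C++ API; all names and structure are invented for illustration.

```python
# Generic event-processing loop: each module sees each event in order.
class Module:
    def process(self, event: dict) -> None:
        raise NotImplementedError

class HitSmearing(Module):
    def process(self, event):
        event["hits"] = [h * 1.01 for h in event.get("raw", [])]

class HitCounter(Module):
    def process(self, event):
        event["n_hits"] = len(event.get("hits", []))

def run(events, modules):
    for event in events:
        for module in modules:       # modules run sequentially per event
            module.process(event)
        yield event

events = [{"raw": [1.0, 2.0]}, {"raw": [3.0]}]
for out in run(events, [HitSmearing(), HitCounter()]):
    print(out["n_hits"])
```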
For the past year, the HEP.TrkX project has been investigating machine learning solutions to LHC particle track reconstruction problems. A variety of models were studied that drew inspiration from computer vision applications and operated on an image-like representation of tracking detector data. While these approaches have shown some promise, image-based methods face challenges in scaling up to realistic HL-LHC data due to high dimensionality and sparsity. In contrast, models that can operate on spacepoint measurements...
Particle track reconstruction in dense environments such as the detectors of the High Luminosity Large Hadron Collider (HL-LHC) is a challenging pattern recognition problem. Traditional tracking algorithms such as the combinatorial Kalman Filter have been used with great success in LHC experiments for years. However, these state-of-the-art techniques are inherently sequential and scale poorly with the expected increases in detector occupancy under HL-LHC conditions. The HEP.TrkX pilot project aims to identify and develop...
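The sequential nature mentioned above can be seen in a toy one-dimensional Kalman filter update, where each hit must update the state before the next can be processed. The numbers are arbitrary illustrative values.

```python
# Toy 1D Kalman filter measurement update.
def kalman_update(x, P, z, R):
    """Combine state estimate (x, P) with measurement z of variance R."""
    K = P / (P + R)              # Kalman gain
    x_new = x + K * (z - x)      # updated state
    P_new = (1.0 - K) * P        # updated covariance
    return x_new, P_new

x, P = 0.0, 1.0                  # initial track-parameter estimate
for z in [0.9, 1.1, 1.0]:        # hits are consumed strictly one at a time
    x, P = kalman_update(x, P, z, R=0.1)
print(x, P)
```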
Quantum information science harnesses the principles of quantum mechanics to realize computational algorithms with complexities vastly intractable by current computer platforms. Typical applications range from chemistry to optimization problems and also include simulations for high energy physics. The recent maturing of quantum hardware has triggered preliminary explorations by several institutions (including Fermilab) of hardware capable of demonstrating quantum advantage in multiple domains, including computing, communications, and sensing....
Machine learning has been applied to several problems in particle physics research, beginning with applications to high-level analysis in the 1990s and 2000s, followed by an explosion of applications in event identification and reconstruction in the 2010s. In this document we discuss promising future research and development areas for machine learning in particle physics. We detail a roadmap for their implementation, software and hardware resource requirements, collaborative initiatives with the data science community, academia and industry, and training the particle physics community in data science....
Apprentice is a tool developed for event generator tuning. It contains a range of conceptual improvements and extensions over the tuning tool Professor. Its core functionality remains the construction of a multivariate analytic surrogate model for computationally expensive Monte-Carlo predictions. The surrogate is used for numerical optimization in chi-square minimization and likelihood evaluation. Apprentice also introduces algorithms to automate the selection of observable weights to minimize the effect of mis-modeling in the generators. We illustrate...
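A minimal sketch of the surrogate-based tuning idea: fit a cheap polynomial surrogate to a few expensive "generator" evaluations, then minimize a chi-square built from the surrogate. The quadratic form, data values, and one-dimensional setup are illustrative assumptions, not Apprentice's internals.

```python
# Surrogate-model tuning in miniature: sample, fit, minimize chi-square.
import numpy as np
from scipy.optimize import minimize_scalar

def expensive_prediction(p):          # stand-in for a Monte-Carlo generator
    return 2.0 * p**2 - 3.0 * p + 1.0

# Sample the generator at a few parameter points and fit the surrogate.
params = np.linspace(-1.0, 2.0, 7)
preds = np.array([expensive_prediction(p) for p in params])
surrogate = np.poly1d(np.polyfit(params, preds, deg=2))  # analytic surrogate

data, sigma = 0.2, 0.05                        # one "measured" observable
chi2 = lambda p: ((surrogate(p) - data) / sigma) ** 2
best = minimize_scalar(chi2, bounds=(-1.0, 2.0), method="bounded")
print(best.x)                                  # tuned parameter value
```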
To address the unprecedented scale of HL-LHC data, the Exa.TrkX project is investigating a variety of machine learning approaches to particle track reconstruction. The most promising of these solutions, graph neural networks (GNN), process an event as a graph that connects track measurements (detector hits corresponding to nodes) with candidate line segments between them (corresponding to edges). Detector information can be associated with nodes and edges, enabling a GNN to propagate the embedded parameters around the graph and predict node-, edge-...
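The event-as-graph representation can be sketched in a few lines: hits become nodes, candidate segments become edges, and one round of message passing propagates node features along edges. The feature values and sum aggregation are illustrative choices.

```python
# Hits as nodes, candidate segments as edges, one message-passing round.
import numpy as np

node_feats = np.array([[1.0, 0.0],   # hit 0
                       [0.0, 1.0],   # hit 1
                       [1.0, 1.0]])  # hit 2
edges = [(0, 1), (1, 2)]             # candidate line segments between hits

def message_pass(feats, edges):
    """Aggregate neighbour features into each node (sum aggregation)."""
    out = feats.copy()
    for src, dst in edges:           # undirected: send both ways
        out[dst] += feats[src]
        out[src] += feats[dst]
    return out

print(message_pass(node_feats, edges))
```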
Charged particle reconstruction in dense environments, such as the detectors of the High Luminosity Large Hadron Collider (HL-LHC), is a challenging pattern recognition problem. Traditional tracking algorithms, such as the combinatorial Kalman Filter, have been used with great success in HEP experiments for years. However, these state-of-the-art techniques are inherently sequential and scale quadratically or worse with increased detector occupancy. The HEP.TrkX pilot project aims to identify and develop cross-experiment...
As larger, higher-quality quantum devices are built and demonstrated in quantum information applications, such as computation and communication, the need for high-quality quantum memories to store quantum states becomes ever more pressing. Future devices will likely use a variety of physical hardware, with some being used primarily for processing and others for storage. Here, we study the correlation structure of the noise models of various possible memory implementations. Through numerical simulation and different approximate analytical formulas applied...
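One simple instance of the memory-noise modelling discussed here is a stored qubit subject to repeated phase damping; a toy density-matrix simulation follows. The noise strength and number of storage steps are arbitrary illustrative choices.

```python
# A |+> state decohering under a repeated phase-flip (dephasing) channel.
import numpy as np

plus = np.array([1.0, 1.0]) / np.sqrt(2.0)
rho = np.outer(plus, plus.conj())          # |+><+|, sensitive to dephasing

p = 0.05                                   # per-step dephasing probability
K0 = np.sqrt(1 - p) * np.eye(2)            # Kraus operators of the channel
K1 = np.sqrt(p) * np.diag([1.0, -1.0])

for _ in range(20):                        # 20 storage time steps
    rho = K0 @ rho @ K0.conj().T + K1 @ rho @ K1.conj().T

fidelity = float(plus.conj() @ rho @ plus) # overlap with the stored state
print(fidelity)                            # decays toward 0.5
```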
Several current and proposed experiments at the Fermi National Accelerator Laboratory, Batavia, IL, USA, have novel data acquisition needs. These include 1) continuous digitization, using commercial high-speed digitizers, of signals from detectors, 2) transfer of all digitized waveform data to commercial off-the-shelf (COTS) processors, 3) filtering or compression of the data, or both, and 4) writing the resultant data to disk for later, more complete, analysis. To address these needs, members of the Detector Simulation and Support Department...
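A toy version of the waveform path just enumerated, digitize, filter, compress, write, can be sketched as below. The threshold, array size, and file name are illustrative assumptions, not the laboratory's actual DAQ code.

```python
# Fake ADC waveform -> zero-suppression filter -> compression -> disk.
import zlib
import numpy as np

rng = np.random.default_rng(1)
waveform = (rng.normal(0, 2, size=10_000) + 5).astype(np.int16)  # fake ADC counts

# Filtering: keep only samples above threshold (zero-suppress the rest).
threshold = 8
filtered = np.where(waveform > threshold, waveform, 0)

# Compression, then write the resultant data to disk for later analysis.
payload = zlib.compress(filtered.tobytes())
with open("waveform.z", "wb") as f:
    f.write(payload)
print(f"{waveform.nbytes} raw bytes -> {len(payload)} compressed")
```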
A full High Energy Physics (HEP) data analysis is divided into multiple data reduction phases. Processing within these phases is extremely time consuming, so intermediate results are stored in files held in mass storage systems and referenced as part of large datasets. This processing model limits what can be done with interactive data analytics. Growth in the size and complexity of experimental datasets, along with emerging big data tools, is beginning to cause changes to the traditional ways of doing analyses. The use of such tools for HEP looks...
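Each reduction phase amounts to a columnar filter-and-reduce step; a minimal sketch follows, with array contents and cut values as illustrative assumptions.

```python
# Columnar event selection: cut, reduce, keep the intermediate result.
import numpy as np

pt = np.array([12.3, 45.1, 8.7, 60.2])    # candidate transverse momenta
eta = np.array([0.5, -1.2, 2.4, 0.1])     # pseudorapidities

mask = (pt > 10.0) & (np.abs(eta) < 2.0)  # event selection cuts
reduced = pt[mask]                        # intermediate result that would be
print(reduced.mean())                     # stored for the next phase
```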
Growing evidence in the scientific computing community indicates that parallel file systems are not sufficient for all HPC storage workloads. This realization has motivated extensive research into new storage system designs. The question of which design we should turn to next implies that there could be a single answer satisfying a wide range of diverse applications. We argue that no such generic solution exists. Instead, custom data services should be designed and tailored to the needs of specific applications on specific hardware. Furthermore, close...
Several current and proposed experiments at the Fermi National Accelerator Laboratory have novel data acquisition needs. These include (1) continuous digitization, using commercial high-speed digitizers, of signals from detectors, (2) transfer of all digitized waveform data to commodity processors, (3) filtering or compression of the data, or both, and (4) writing the resultant data to disk for later, more complete, analysis.
Accessing and analyzing data from cosmological simulations is a major challenge due to the prohibitive size of the datasets and the diversity of the associated large-scale analysis tasks. Analysis of the simulated models requires direct access to the datasets, considerable compute infrastructure, and storage capacity for the results. Resource limitations can become serious obstacles to performing research on the most advanced simulations. The Portal for Data Analysis Services for Cosmological Simulations (PDACS) is a web-based workflow service and scientific...
Experimental Particle Physics has been at the forefront of analyzing the world's largest datasets for decades. The HEP community was among the first to develop suitable software and computing tools for this task. In recent times, new toolkits and systems, collectively called Big Data technologies, have emerged to support the analysis of Petabyte and Exabyte datasets in industry. While the principles of data analysis have not changed (filtering and transforming experiment-specific formats), these new technologies use different approaches and promise a fresh look at the analysis of very large datasets that could...
We present an evaluation of the performance of a Spark implementation of a classification algorithm in the domain of High Energy Physics (HEP). Spark is a general engine for in-memory, large-scale data processing, designed for applications where similar, repeated analysis is performed on the same large data sets. Classification problems are among the most common and critical data processing tasks across many domains. Many of them are both computation- and data-intensive, involving complex numerical computations on extremely large data sets. We evaluated the implementation on Cori,...
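A hedged sketch of a Spark-based classification workflow of the kind evaluated here; the toy data, column names, and choice of logistic regression are our assumptions, not the paper's actual benchmark code.

```python
# Minimal Spark ML classification: build a DataFrame, fit, report accuracy.
from pyspark.sql import SparkSession
from pyspark.ml.classification import LogisticRegression
from pyspark.ml.linalg import Vectors

spark = SparkSession.builder.appName("hep-classify").getOrCreate()

# Tiny stand-in for a large HEP feature set: (features, label) rows.
rows = [(Vectors.dense([0.1, 1.2]), 0.0),
        (Vectors.dense([2.3, 0.4]), 1.0),
        (Vectors.dense([0.2, 1.0]), 0.0),
        (Vectors.dense([2.1, 0.3]), 1.0)]
df = spark.createDataFrame(rows, ["features", "label"])

model = LogisticRegression(maxIter=10).fit(df)   # in-memory training
print(model.summary.accuracy)
spark.stop()
```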
This paper presents a graph neural network (GNN) technique for low-level reconstruction of neutrino interactions in a Liquid Argon Time Projection Chamber (LArTPC). GNNs are still a relatively novel technique, and have shown great promise for similar tasks at the LHC. In this paper, multihead attention message passing is used to classify the relationship between detector hits by labelling graph edges, determining whether hits were produced by the same underlying particle and, if so, the particle type. The trained model is 84% accurate...
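A minimal sketch of multihead attention over hit features followed by edge classification, in the spirit of the technique described above. The dimensions, toy inputs, and scoring head are illustrative, not the paper's model.

```python
# Multihead attention over hits, then score candidate hit pairs (edges).
import torch
import torch.nn as nn

n_hits, d = 6, 16
hits = torch.randn(n_hits, d)                  # per-hit feature vectors
edges = torch.tensor([[0, 1], [1, 2], [3, 4]]) # candidate hit pairs

attn = nn.MultiheadAttention(embed_dim=d, num_heads=4, batch_first=True)
updated, _ = attn(hits.unsqueeze(0), hits.unsqueeze(0), hits.unsqueeze(0))
updated = updated.squeeze(0)                   # each hit attends to all hits

# Edge classifier: score each candidate pair from its endpoint embeddings.
edge_mlp = nn.Sequential(nn.Linear(2 * d, 32), nn.ReLU(), nn.Linear(32, 1))
pair_feats = torch.cat([updated[edges[:, 0]], updated[edges[:, 1]]], dim=1)
same_particle_logits = edge_mlp(pair_feats)    # >0 suggests "same particle"
print(same_particle_logits.squeeze(1))
```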