L. Tompkins
- Theoretical and Experimental Particle Physics Studies
- High-Energy Particle Collisions Research
- Particle Detector Development and Performance
- Quantum Chromodynamics and Particle Interactions
- Dark Matter and Cosmic Phenomena
- Computational Physics and Python Applications
- Neutrino Physics Research
- Cosmology and Gravitation Theories
- Distributed and Parallel Computing Systems
- Medical Imaging Techniques and Applications
- Radiation Detection and Scintillator Technologies
- Advanced Data Storage Technologies
- Astrophysics and Cosmic Phenomena
- Advanced Mathematical Theories
- Atomic and Subatomic Physics Research
- Algorithms and Data Compression
- Muon and Positron Interactions and Applications
- Structural Analysis of Composite Materials
- Nuclear Physics and Applications
- Black Holes and Theoretical Physics
- Particle Accelerators and Free-Electron Lasers
- Advanced X-ray and CT Imaging
- Antenna Design and Analysis
- Big Data Technologies and Applications
- Scientific Computing and Data Management
SLAC National Accelerator Laboratory
2016-2025
Stanford University
2015-2025
University of California, Santa Cruz
2023-2024
Istanbul University
2023-2024
A. Alikhanyan National Laboratory
2024
Atlas Scientific (United States)
2024
The University of Adelaide
2019-2023
Northern Illinois University
2017-2023
Boğaziçi University
2023
The University of Melbourne
2020
The work contained herein constitutes a report of the "Beyond the Standard Model" working group for the Workshop "Physics at TeV Colliders", Les Houches, France, 26 May--6 June, 2003. The research presented is original and was performed specifically for the workshop. Tools for calculations in the minimal supersymmetric standard model are presented, including a comparison of the dark matter relic density predicted by public codes. Reconstruction of particle masses at the LHC and at a future linear collider facility is examined. Less orthodox...
The constituents of dark matter are still unknown, and the viable possibilities span a vast range of masses. The physics community has established searching for sub-GeV dark matter as a high priority and has identified accelerator-based experiments as an essential facet of this search strategy. A key goal of this program is testing the broad idea of thermally produced dark matter through experiments designed to directly produce dark matter particles. The most sensitive way to search for the production of light dark matter is to use a primary electron beam and produce it in fixed-target collisions. The Light Dark Matter eXperiment...
We are studying the use of deep neural networks (DNNs) to identify and locate primary vertices (PVs) in proton-proton collisions at the LHC. Earlier work focused on finding PVs in simulated LHCb data using a hybrid approach that started with kernel density estimators (KDEs) derived heuristically from an ensemble of charged track parameters and predicted "target histogram" proxies, from which the actual PV positions are extracted. We have recently demonstrated that a UNet architecture performs indistinguishably from a "flat" convolutional...
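A minimal sketch of the kernel-density idea described above, not the experiment's actual pipeline: each track's point of closest approach to the beamline is smeared into a one-dimensional histogram along z, and peaks in that histogram serve as primary-vertex candidates. The per-track resolution, bin width, and peak threshold below are illustrative assumptions.

```python
import numpy as np
from scipy.signal import find_peaks

def kde_histogram(track_z, track_sigma_z, z_edges):
    """Sum per-track Gaussian densities along the beamline into a KDE histogram."""
    centers = 0.5 * (z_edges[:-1] + z_edges[1:])
    kde = np.zeros_like(centers)
    for z0, sigma in zip(track_z, track_sigma_z):
        kde += np.exp(-0.5 * ((centers - z0) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    return centers, kde

# Toy event: three true vertices, each contributing a handful of tracks.
rng = np.random.default_rng(0)
true_pvs = [-45.0, 3.0, 61.0]  # mm along the beamline (invented values)
track_z = np.concatenate([rng.normal(pv, 0.1, size=20) for pv in true_pvs])
track_sigma_z = np.full_like(track_z, 0.1)  # assumed per-track resolution (mm)

centers, kde = kde_histogram(track_z, track_sigma_z, np.linspace(-100, 100, 2001))
peaks, _ = find_peaks(kde, height=5.0)  # threshold is an illustrative choice
print("PV candidates (mm):", centers[peaks])
```

In the hybrid approach sketched here, the histogram would then be fed to a network that predicts the target-histogram proxy rather than being peak-searched directly.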
The reconstruction of the trajectories of charged particles, or track reconstruction, is a key computational challenge for particle and nuclear physics experiments. While the tuning of track reconstruction algorithms can depend strongly on details of the detector geometry, the algorithms currently in use by experiments share many common features. At the same time, the intense environment of the High-Luminosity LHC accelerator and other future experiments is expected to put even greater stress on tracking software, motivating the development of more performant algorithms. We present here A Common...
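As a toy illustration of one building block shared by such tracking software, and not code from the toolkit itself, the sketch below fits a straight-line track to hits on successive detector layers by weighted least squares; production track fits add Kalman filtering, material effects, and magnetic-field propagation.

```python
import numpy as np

def fit_straight_track(z, x, sigma_x):
    """Weighted least-squares fit of x = a + b*z to measured hit positions."""
    w = 1.0 / np.asarray(sigma_x) ** 2
    A = np.vstack([np.ones_like(z), z]).T          # design matrix [1, z]
    cov = np.linalg.inv(A.T @ (w[:, None] * A))    # parameter covariance
    a, b = cov @ (A.T @ (w * x))                   # intercept and slope
    return (a, b), cov

# Hits on five detector layers with 20 micron resolution (illustrative numbers).
z_layers = np.array([30.0, 60.0, 90.0, 120.0, 150.0])  # mm
true_a, true_b = 1.0, 0.02
x_hits = true_a + true_b * z_layers + np.random.default_rng(1).normal(0, 0.02, 5)
params, cov = fit_straight_track(z_layers, x_hits, sigma_x=np.full(5, 0.02))
print("fitted intercept, slope:", params)
```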
Modern experiments search for extremely rare processes hidden in much larger background levels. As the experiment's complexity, accelerator backgrounds, and luminosity increase, we need increasingly complex and exclusive event selection. We present the first prototype of a new Processing Unit (PU), the core of the FastTracker processor (FTK). FTK is a real-time tracking device for the ATLAS trigger upgrade. The computing power of the PU is such that a few hundred of them will be able to reconstruct all the tracks with transverse momentum...
During the current LHC shutdown period, the ATLAS experiment will upgrade its Trigger and Data Acquisition system to include a hardware tracker coprocessor: the Fast TracKer (FTK). The FTK receives data from the 80 million channels of the silicon detector, identifying charged particle tracks and reconstructing their parameters at a rate of up to 100 kHz within microseconds. To achieve this performance, it identifies track candidates utilizing the computing power of a custom ASIC chip with associative memory (AM) designed to perform "pattern matching"...
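A small software analogy for the associative-memory idea described above (the real device performs this massively in parallel in custom silicon): track patterns are stored as sets of coarse detector bins, one per layer, and any pattern whose bins all appear among the event's fired bins becomes a track candidate. The pattern bank and bin granularity here are invented for illustration.

```python
# Hedged software analogy of AM pattern matching; bank and granularity are toy choices.
SUPERSTRIP_WIDTH = 4  # detector channels per coarse bin

def to_superstrip(layer, channel):
    return (layer, channel // SUPERSTRIP_WIDTH)

# Toy pattern bank: each pattern is a frozenset of (layer, superstrip) pairs.
pattern_bank = [
    frozenset({(0, 3), (1, 3), (2, 4), (3, 4)}),
    frozenset({(0, 7), (1, 8), (2, 9), (3, 10)}),
]

def match_patterns(hits, bank):
    """Return the patterns fully contained in the event's fired superstrips."""
    fired = {to_superstrip(layer, channel) for layer, channel in hits}
    return [p for p in bank if p <= fired]

event_hits = [(0, 13), (1, 14), (2, 17), (3, 19), (2, 40)]  # (layer, channel)
print(match_patterns(event_hits, pattern_bank))
```

Matched patterns only define coarse roads; the full-resolution hits inside each road are then passed to a fast track fit.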
Particle track reconstruction, in which the trajectories of charged particles are determined, is a critical and time-consuming component of the full event reconstruction chain. The underlying software is complex and consists of a number of mathematically intense algorithms, each dealing with a particular tracking sub-process. These algorithms have many input parameters that need to be supplied in advance. However, it is difficult to determine the configuration of these parameters that produces the best performance. Currently, parameter...
The use of tracking information at the trigger level in the LHC Run II period is crucial for the trigger and data acquisition system, and it will be even more so as the number of contemporary collisions that occur in every bunch crossing increases in Run III. The Fast TracKer is part of the ATLAS trigger upgrade project; it is a hardware processor that will provide, for every Level-1 accepted event (100 kHz) and within 100 μs, full tracks with momentum as low as 1 GeV. By providing fast, extensive access to tracking information, with resolution comparable to the offline reconstruction, the FTK will help in the precise detection...
The ATLAS Fast TracKer is a custom electronics system that will operate at the full Level-1 accept trigger rate, 100 kHz, to provide high quality tracks as input to the high-level trigger. The event track reconstruction is performed in hardware, thanks to the massive parallelism of associative memories and FPGAs. We present the advantages for the physics goals of the experiment at the LHC, as well as recent results on the design, technological advancements, and tests of some core components used in the processor.
Particle tracking is among the most sophisticated and complex parts of the full event reconstruction chain. A number of algorithms work in a sequence to build these trajectories from detector hits. Each of them uses many configuration parameters that need to be fine-tuned to properly account for the detector/experimental setup, the available CPU budget, and the desired physics performance. A few examples of such parameters include cut values limiting the search space of the algorithm, approximations accounting for complex phenomena, or parameters controlling the behavior of the algorithm. The popular...
The Fast Tracker (FTK) processor is an approved ATLAS upgrade that will reconstruct tracks using the full silicon tracker at the Level-1 rate (up to 100 kHz). FTK uses a completely parallel approach to read the detector information, execute the pattern matching, and fit the tracks. This approach, according to detailed simulation results, allows tracking with nearly offline resolution within an execution time of 100 μs. A central component of the system is the associative memories (AM); these special devices reduce the combinatoric problem...
The extended use of tracking information at the trigger level in the LHC is crucial for the trigger and data acquisition (TDAQ) system to fulfill its task. Precise and fast tracking is important to identify specific decay products of the Higgs boson or of new phenomena, as well as to distinguish the contributions coming from the many collisions that occur in every bunch crossing. However, track reconstruction is among the most demanding tasks performed by the TDAQ computing farm; in fact, a complete reconstruction at the full Level-1 accept rate (100 kHz) is not possible. In order to overcome this...
The reconstruction of charged particle trajectories, known as tracking, is one of the most complex and CPU-consuming parts of event processing in high energy physics experiments. The most widely used and best performing tracking algorithms require significant geometry-specific tuning of the algorithm parameters to achieve the best results. In this paper, we demonstrate the usage of machine learning techniques, in particular evolutionary algorithms, to find high performing configurations for the first step of tracking, called track seeding. We use a seeding algorithm from...
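A minimal sketch of the evolutionary-tuning idea under stated assumptions: the `seeding_score` objective below is a stand-in for running the actual seeding algorithm and scoring its efficiency and fake rate, and the parameter names and ranges are invented for illustration rather than taken from any real seeder.

```python
import random

# Illustrative parameter ranges; names are hypothetical, not the real seeder's options.
PARAM_RANGES = {"max_d0": (0.5, 10.0), "min_pt": (0.3, 2.0), "cot_theta_max": (5.0, 12.0)}

def seeding_score(params):
    """Stand-in for running track seeding and scoring efficiency vs. fake rate."""
    return -sum((v - sum(PARAM_RANGES[k]) / 2) ** 2 for k, v in params.items())

def mutate(params, scale=0.1):
    """Gaussian mutation clipped to the allowed range of each parameter."""
    child = {}
    for k, (lo, hi) in PARAM_RANGES.items():
        child[k] = min(hi, max(lo, params[k] + random.gauss(0, scale * (hi - lo))))
    return child

# Simple (mu + lambda) evolutionary loop over seeding configurations.
random.seed(42)
population = [{k: random.uniform(*r) for k, r in PARAM_RANGES.items()} for _ in range(8)]
for generation in range(30):
    offspring = [mutate(random.choice(population)) for _ in range(16)]
    population = sorted(population + offspring, key=seeding_score, reverse=True)[:8]

print("best configuration:", population[0])
```

In practice each score evaluation is an expensive simulation-and-reconstruction run, so the population size and number of generations are chosen to fit the available CPU budget.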
The Heavy Photon Search (HPS) experiment is designed to search for a new vector boson $A^\prime$ in the mass range of 20 MeV/$c^2$ to 220 MeV/$c^2$ that kinetically mixes with the Standard Model photon with couplings $\epsilon^2 > 10^{-10}$. In addition to the general importance of exploring light, weakly coupled physics that is difficult to probe at high-energy colliders, a prime motivation for this search is the possibility that sub-GeV thermal relics constitute the dark matter, a scenario that requires a comparably light mediator, where models with a hidden $U(1)$ gauge...
A first measurement of the inelastic cross-section in proton-proton collisions at $\sqrt{s} = 7$ TeV using the ATLAS detector at the Large Hadron Collider is presented. The measurement is made using scintillators in the forward region of the detector. Prospects for elastic measurements are also discussed.
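For orientation, counting measurements of this kind relate the selected event yield to the cross-section through a relation of the form below; the notation is generic and is not the paper's specific set of symbols or corrections:
\[
\sigma_{\mathrm{inel}} = \frac{N_{\mathrm{sel}} - N_{\mathrm{bkg}}}{\varepsilon \int L \, dt},
\]
where $N_{\mathrm{sel}}$ is the number of selected events, $N_{\mathrm{bkg}}$ the estimated background, $\varepsilon$ the selection efficiency within the detector acceptance, and $\int L \, dt$ the integrated luminosity.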
This paper presents a method to achieve backplane data transfer capabilities to and from a VME Rear Transition Module (RTM), while remaining in full compliance with the VME64/VIPA Specification. The VME bus is specified for transfers only between boards plugged into the front side of the subrack. RTMs receive power from the crate, but are not part of the actual bus: they do not plug into the J1 connector and have no access to the bus interface logic. In this implementation, the RTM uses feed-through pins located on the J2 connector to communicate with the corresponding front module....
Particle tracking is among the most sophisticated and complex parts of the full event reconstruction chain. A number of algorithms work in a sequence to build these trajectories from detector hits. These algorithms use many configuration parameters that need to be fine-tuned to properly account for the detector/experimental setup, the available CPU budget, and the desired physics performance. The popular method to tune these parameters is hand-tuning or using brute-force techniques. These techniques can be inefficient and raise issues of long-term maintainability of such...