- Particle Physics Theoretical and Experimental Studies
- High-Energy Particle Collisions Research
- Quantum Chromodynamics and Particle Interactions
- Particle Detector Development and Performance
- Dark Matter and Cosmic Phenomena
- Cosmology and Gravitation Theories
- Neutrino Physics Research
- Computational Physics and Python Applications
- Distributed and Parallel Computing Systems
- Advanced Data Storage Technologies
- Astrophysics and Cosmic Phenomena
- Black Holes and Theoretical Physics
- Medical Imaging Techniques and Applications
- Radiation Detection and Scintillator Technologies
- Parallel Computing and Optimization Techniques
- Particle Accelerators and Free-Electron Lasers
- Scientific Computing and Data Management
- Interconnection Networks and Systems
- Atomic and Subatomic Physics Research
- Nuclear Reactor Physics and Engineering
- Radiation Effects in Electronics
- Radiation Therapy and Dosimetry
- Big Data Technologies and Applications
- Gamma-Ray Bursts and Supernovae
- Astronomy and Astrophysical Research
University of California, San Diego
2016-2025
University of California System
2024-2025
Laboratoire d’Astrophysique de Marseille
2023-2025
Institute of High Energy Physics
2015-2024
European Organization for Nuclear Research
2015-2024
A. Alikhanyan National Laboratory
2022-2024
Integrated BioTherapeutics (United States)
2024
University of Antwerp
2024
Château Gombert
2023
University of California, Santa Barbara
2014-2023
Abstract Baryon Acoustic Oscillations can be measured with sub-percent precision above redshift two using the Lyman-α (Lyα) forest auto-correlation and its cross-correlation with quasar positions. This is one of the key goals of the Dark Energy Spectroscopic Instrument (DESI), which started its main survey in May 2021. We present in this paper a study of contaminants to the Lyα forest, which are mainly caused by correlated signals introduced by the spectroscopic data processing pipeline, as well as astrophysical contaminants due to foreground absorption in the intergalactic...
We present the FP420 R&D project, which has been studying the key aspects of the development and installation of a silicon tracker and fast-timing detectors in the LHC tunnel at 420 m from the interaction points of the ATLAS and CMS experiments. These detectors would measure very forward protons precisely, in conjunction with the corresponding central detectors, as a means to study Standard Model (SM) physics and to search for and characterise new physics signals. This report includes a detailed description of the physics case for the detector and, in particular, the measurement of Central Exclusive...
Abstract The deposits of the Pliocene-Quaternary foredeep of the Northern Apennines cover at present an area of 103,000 km². The original boundaries of the basin are not known, since the marginal deposits have been eroded, in particular those of the inner, southwestern border. During Pliocene times the basin was reduced by thrust tectonics, and the amount of shortening may be tentatively estimated. The volume of the Quaternary sediments is inferred with good approximation from maps of the base (base of the Hyalinea balthica Zone) of the successions. It has been corrected by adding an estimate...
Abstract For the Phase-2 upgrade of the CMS experiment, the central DAQ group designed and developed two custom ATCA boards. These boards provide the interfaces between the sub-detector electronics and the central DAQ systems. This paper describes our experience with the chosen prototyping strategy, with a focus on the design and modification choices made along the way. It concludes with a brief overview of recent firmware developments, and a look at the transition towards full board production.
The CMS data acquisition system is made of two major subsystems: event building and event filter. The presented paper describes the architecture and design of the software that processes the data flow in the currently operating experiment. The central DAQ relies on industry-standard networks and processing equipment. Adopting a single software infrastructure for all subsystems of the experiment imposes, however, a number of different requirements. High efficiency and configuration flexibility are among the most important ones. XDAQ has matured over an eight-year...
The Compact Muon Solenoid (CMS) experiment operating at the CERN (European Organization for Nuclear Research) Large Hadron Collider (LHC) is in the process of upgrading several of its detector systems. Adding more individual components brings the need to test and commission them separately from the existing ones, so as not to compromise physics data-taking. The CMS Trigger, Timing and Control (TTC) system had reached its limits in terms of the number of separate elements (partitions) that could be supported. A new Trigger Control and Distribution System...
The data acquisition (DAQ) system of the CMS experiment at the CERN Large Hadron Collider assembles events at a rate of 100 kHz, transporting event data at an aggregate throughput of 100 GB/s to the high-level trigger (HLT) farm. The HLT farm selects interesting events for storage and offline analysis at a rate of around 1 kHz. The DAQ system has been redesigned during the accelerator shutdown in 2013/14. The motivation is twofold: firstly, the current compute nodes, networking and infrastructure will have reached the end of their lifetime by the time the LHC restarts. Secondly,...
The data-acquisition system of the CMS experiment at the LHC performs the read-out and assembly of events accepted by the first-level hardware trigger. Assembled events are made available to the high-level trigger, which selects interesting events for offline storage and analysis. The system is designed to handle a maximum input rate of 100 kHz and an aggregated throughput of 100 GB/s originating from approximately 500 sources. An overview of the architecture and design of the DAQ software is given. We discuss the performance and operational experience from the first months of physics data taking.
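The figures quoted in this abstract allow a simple back-of-envelope sizing of the event builder. The sketch below is purely illustrative arithmetic using only the numbers stated above (100 kHz input rate, 100 GB/s aggregate throughput, ~500 sources); the derived averages are not taken from the paper itself:

```python
# Back-of-envelope sizing from the quoted DAQ figures.
# Inputs (from the abstract): 100 kHz accept rate, 100 GB/s
# aggregate throughput, ~500 read-out sources.

input_rate_hz = 100e3      # first-level trigger accept rate
throughput_bps = 100e9     # aggregate event-builder throughput, bytes/s
n_sources = 500            # approximate number of read-out sources

avg_event_size = throughput_bps / input_rate_hz    # bytes per built event
avg_fragment_size = avg_event_size / n_sources     # bytes per source fragment

print(f"average event size:    {avg_event_size / 1e6:.1f} MB")   # 1.0 MB
print(f"average fragment size: {avg_fragment_size / 1e3:.1f} kB")  # 2.0 kB
```

This recovers the often-quoted ~1 MB average event size, with each source contributing a ~2 kB fragment on average.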
For the upgrade of the DAQ system of the CMS experiment in 2013/2014, an interface between the custom detector Front End Drivers (FEDs) and the new event-builder network has to be designed. For a loss-less data collection from more than 600 FEDs, an FPGA-based card implementing the TCP/IP protocol suite over 10 Gbps Ethernet has been developed. We present the hardware challenges and the modifications made to TCP in order to simplify its implementation, together with a set of performance measurements which were carried out on the current prototype.
The DAQ system of the CMS experiment at CERN collects data from more than 600 custom detector Front-End Drivers (FEDs). During 2013 and 2014 it will undergo a major upgrade to address the obsolescence of the current hardware and the requirements posed by the upgrade of the LHC accelerator and various detector components. For a loss-less data collection from the FEDs, a new FPGA-based card implementing the TCP/IP protocol suite over 10 Gbps Ethernet has been developed. To limit the TCP implementation complexity, the group developed a simplified, unidirectional, but RFC 793...
The CMS data acquisition system is designed to build and filter events originating from 476 detector data sources at a maximum trigger rate of 100 kHz. Different architectures and switch technologies have been evaluated to accomplish this purpose. Events will be built in two stages: the first stage is a set of event builders called front-end driver (FED) builders. These are based on Myrinet technology and pre-assemble groups of about eight data sources. The second stage, the readout builders, perform the building of full events. A single readout builder assembles events from 60 sources of 16 kB...
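The two-stage figures quoted above are mutually consistent, which a short illustrative check makes explicit; the only inputs are the numbers stated in the abstract (476 sources, groups of ~8, 16 kB super-fragments), and the derived event size is my own arithmetic, not a figure from the paper:

```python
# Consistency check of the two-stage event-builder figures:
# 476 sources pre-assembled in groups of ~8 by FED builders gives
# ~60 super-fragments per event; at ~16 kB each, this yields an
# event size of roughly 1 MB.

n_sources = 476
group_size = 8        # sources pre-assembled per FED builder
fragment_kb = 16      # super-fragment size quoted in the abstract

n_superfragments = n_sources / group_size          # ~59.5
event_size_kb = round(n_superfragments) * fragment_kb

print(f"super-fragments per event: {n_superfragments:.1f}")  # 59.5
print(f"approx. event size: {event_size_kb} kB")             # 960 kB
```

The ~960 kB result matches the ~1 MB average event size implied by the 100 GB/s at 100 kHz figures quoted in the neighbouring abstracts.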
Summary form only given. The data acquisition system (DAQ) of the CMS experiment at the CERN Large Hadron Collider assembles events at a rate of 100 kHz, transporting event data at an aggregate throughput of 100 GB/s to the high-level trigger (HLT) farm. The HLT farm selects interesting events for storage and offline analysis at a rate of around 1 kHz. The DAQ system has been redesigned during the accelerator shutdown in 2013/14. The motivation is twofold: firstly, the current compute nodes, networking and infrastructure will have reached the end of their lifetime by the time the LHC...
Abstract This paper describes recent progress on the design of the DAQ and Timing Hub, or DTH, an ATCA (Advanced Telecommunications Computing Architecture) hub board intended for the phase-2 upgrade of the CMS experiment. Prototyping was originally divided into multiple feature lines, spanning all the different aspects of DTH functionality. The second prototype merges the R&D prototyping lines into a single board, which is intended to be the production candidate. Emphasis is placed on the process and experience in going from the first prototype, which included...
The CMS data acquisition (DAQ) is implemented as a service-oriented architecture where DAQ applications, as well as general applications such as monitoring and error reporting, are run as self-contained services. The task of deployment and operation of these services is achieved by using several heterogeneous facilities, with custom configuration scripts in several languages. In this work, we restructure the existing system into a homogeneous, scalable cloud architecture, adopting a uniform paradigm, with all services orchestrated in an environment with standardized...
The Compact Muon Solenoid (CMS) experiment at CERN incorporates one of the highest-throughput data acquisition systems in the world, and is expected to increase its throughput by more than a factor of ten for the High-Luminosity phase of the Large Hadron Collider (HL-LHC). To achieve this goal, the system will be upgraded in most of its components. Among them, the event builder software, in charge of assembling all the data read out from the different sub-detectors, is planned to be modified from building a single event at a time to an orbit-based approach that assembles multiple events at the same time. The increased throughput over the current...
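The motivation for orbit-based building can be illustrated with a short sketch: at HL-LHC trigger rates, many accepted events fall within a single LHC orbit, so grouping them amortizes per-event overheads. The numbers below are my own assumptions, not taken from the abstract: the LHC revolution frequency is about 11.245 kHz, and 750 kHz is the commonly quoted CMS Phase-2 first-level accept rate:

```python
# Average number of accepted events per LHC orbit at assumed
# Phase-2 conditions (illustrative; figures are assumptions,
# not from the abstract above).

orbit_freq_hz = 11245.0   # LHC revolution frequency, ~11.245 kHz
l1_rate_hz = 750e3        # assumed Phase-2 first-level accept rate

events_per_orbit = l1_rate_hz / orbit_freq_hz
print(f"average accepted events per orbit: {events_per_orbit:.1f}")
```

With these assumptions, a few tens of events arrive per orbit on average, which is why assembling a whole orbit at once is attractive compared with per-event building.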