- Particle Physics Theoretical and Experimental Studies
- High-Energy Particle Collisions Research
- Quantum Chromodynamics and Particle Interactions
- Particle Detector Development and Performance
- Computational Physics and Python Applications
- Dark Matter and Cosmic Phenomena
- Neutrino Physics Research
- Cosmology and Gravitation Theories
- Medical Imaging Techniques and Applications
- Distributed and Parallel Computing Systems
- Astrophysics and Cosmic Phenomena
- Radiation Detection and Scintillator Technologies
- Advanced Data Storage Technologies
- Parallel Computing and Optimization Techniques
- Black Holes and Theoretical Physics
- Scientific Computing and Data Management
- Gamma-Ray Bursts and Supernovae
- Particle Accelerators and Free-Electron Lasers
- Big Data Technologies and Applications
- Algorithms and Data Compression
- CCD and CMOS Imaging Sensors
- Atomic and Subatomic Physics Research
- Stochastic Processes and Financial Applications
- Nuclear Physics and Applications
- Data Quality and Management
European Organization for Nuclear Research
2015-2025
A. Alikhanyan National Laboratory
2022-2024
Institute of High Energy Physics
2022-2024
University of Antwerp
2024
École Normale Supérieure de Lyon
2024
Istanbul University
2022-2023
Fermi National Accelerator Laboratory
2019
University of Milano-Bicocca
2011
Istituto Nazionale di Fisica Nucleare
2007
University of Insubria
2005-2006
The High-Luminosity upgrade of the Large Hadron Collider (LHC) will see the accelerator reach an instantaneous luminosity of $7\times 10^{34}\,\mathrm{cm}^{-2}\mathrm{s}^{-1}$ with an average pileup of 200 proton-proton collisions. These conditions pose an unprecedented challenge to the online and offline reconstruction software developed by the experiments. The computational complexity will far exceed the expected increase in processing power for conventional CPUs, demanding an alternative approach. Industry and High-Performance Computing (HPC) centers are...
One of the challenges of high granularity calorimeters, such as the one to be built to cover the endcap region in the CMS Phase-2 Upgrade for the HL-LHC, is that the large number of channels causes a surge in the computing load when clustering numerous digitized energy deposits (hits) at the reconstruction stage. In this article, we propose a fast and fully parallelizable density-based clustering algorithm, optimized for high-occupancy scenarios, where the number of clusters is much larger than the average number of hits in a cluster. The algorithm uses a grid spatial index for querying...
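To make the grid-spatial-index idea concrete, below is a minimal, self-contained C++ sketch of the neighborhood query that underpins such a density-based algorithm: hits are binned into a fixed grid, and the local density of each hit is computed by querying only the surrounding cells. The types and names (`Hit`, `Grid`, `localDensity`) are illustrative assumptions, not the actual CLUE/CMSSW code.

```cpp
#include <algorithm>
#include <cstddef>
#include <cstdio>
#include <vector>

struct Hit { float x, y, energy; };

// Fixed-size 2D grid over a detector layer: each cell stores the indices of
// the hits falling inside it, so a neighborhood query touches only a few
// cells instead of all N hits.
struct Grid {
  float xmin, ymin, cell;
  int nx, ny;
  std::vector<std::vector<int>> bins;

  Grid(float xmin_, float ymin_, float cell_, int nx_, int ny_)
      : xmin(xmin_), ymin(ymin_), cell(cell_), nx(nx_), ny(ny_),
        bins(static_cast<std::size_t>(nx_) * ny_) {}

  int clampX(int ix) const { return std::min(nx - 1, std::max(0, ix)); }
  int clampY(int iy) const { return std::min(ny - 1, std::max(0, iy)); }

  void fill(const std::vector<Hit>& hits) {
    for (int i = 0; i < static_cast<int>(hits.size()); ++i) {
      int ix = clampX(static_cast<int>((hits[i].x - xmin) / cell));
      int iy = clampY(static_cast<int>((hits[i].y - ymin) / cell));
      bins[static_cast<std::size_t>(iy) * nx + ix].push_back(i);
    }
  }
};

// Local density of hit i: energy-weighted sum over hits within a cutoff
// distance dc. Assuming cell size >= dc, only the 3x3 block of cells around
// the hit needs to be scanned. Each hit's density is independent of the
// others, which is what makes the step fully parallelizable (e.g. one GPU
// thread per hit).
float localDensity(int i, const std::vector<Hit>& hits, const Grid& g, float dc) {
  const Hit& hi = hits[i];
  int ix = g.clampX(static_cast<int>((hi.x - g.xmin) / g.cell));
  int iy = g.clampY(static_cast<int>((hi.y - g.ymin) / g.cell));
  float rho = 0.f;
  for (int jy = std::max(0, iy - 1); jy <= std::min(g.ny - 1, iy + 1); ++jy)
    for (int jx = std::max(0, ix - 1); jx <= std::min(g.nx - 1, ix + 1); ++jx)
      for (int j : g.bins[static_cast<std::size_t>(jy) * g.nx + jx]) {
        float dx = hits[j].x - hi.x, dy = hits[j].y - hi.y;
        if (dx * dx + dy * dy <= dc * dc) rho += hits[j].energy;
      }
  return rho;
}

int main() {
  std::vector<Hit> hits{{0.5f, 0.5f, 2.f}, {0.6f, 0.4f, 1.f}, {5.0f, 5.0f, 3.f}};
  Grid grid(0.f, 0.f, /*cell=*/1.f, /*nx=*/10, /*ny=*/10);
  grid.fill(hits);
  for (int i = 0; i < static_cast<int>(hits.size()); ++i)
    std::printf("hit %d: rho = %.2f\n", i, localDensity(i, hits, grid, /*dc=*/1.f));
}
```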
The Data Quality Monitoring software is a central tool in the CMS experiment. It is used in the following key environments: (i) Online, for real-time detector monitoring; (ii) Offline, for prompt offline feedback and the final fine-grained data quality analysis and certification; (iii) Validation of all reconstruction and production software releases; (iv) Monte Carlo productions. Though the basic structure of the Run 1 DQM system remains the same in Run 2, between the two run periods it underwent substantial upgrades in many areas, not only to adapt...
In this work a detailed spectral analysis for the periodicity search of the time series of the $^{8}\mathrm{B}$ solar neutrino flux released by the SNO Collaboration is presented. The data have been made publicly available with truncation of the event times to the unit of day (1 day binning); they are thus suited to undergo a traditional Lomb-Scargle investigation, as well as an extension of such a method based on a likelihood approach. The results presented here confirm the absence of modulation signatures in the data. For completeness, a more refined ``1 day binned'' analysis is also...
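For reference, the classical Lomb-Scargle periodogram evaluated in such an investigation can be written as

$$
P(\omega) = \frac{1}{2\sigma^2}\left\{ \frac{\left[\sum_j (y_j-\bar{y})\cos\omega(t_j-\tau)\right]^2}{\sum_j \cos^2\omega(t_j-\tau)} + \frac{\left[\sum_j (y_j-\bar{y})\sin\omega(t_j-\tau)\right]^2}{\sum_j \sin^2\omega(t_j-\tau)} \right\},
\qquad
\tan(2\omega\tau) = \frac{\sum_j \sin 2\omega t_j}{\sum_j \cos 2\omega t_j},
$$

where $y_j$ are the flux measurements at times $t_j$, $\bar{y}$ and $\sigma^2$ are their mean and variance, and the phase $\tau$ makes the estimate invariant under shifts of the time origin.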
The advent of computing resources with co-processors, for example Graphics Processing Units (GPU) or Field-Programmable Gate Arrays (FPGA), for use cases like the CMS High-Level Trigger (HLT) or data processing at leadership-class supercomputers imposes challenges on the current data processing frameworks. These include developing a model for algorithms to offload their computations onto the co-processors as well as keeping the traditional CPU busy doing other work. The CMS framework, CMSSW, implements multithreading using the Intel Threading...
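The following standard-C++ sketch illustrates the general offload pattern described here: launch work asynchronously on a co-processor stand-in, keep the CPU busy with independent work, then pick up the result. It is a simplified illustration of the pattern under those assumptions, not the actual CMSSW/TBB API.

```cpp
#include <chrono>
#include <cmath>
#include <future>
#include <iostream>
#include <numeric>
#include <thread>
#include <vector>

// Stand-in for work submitted to a co-processor (GPU/FPGA): it runs
// asynchronously while the calling CPU thread remains free.
std::future<double> offloadToCoprocessor(std::vector<double> data) {
  return std::async(std::launch::async, [data = std::move(data)] {
    std::this_thread::sleep_for(std::chrono::milliseconds(10));  // fake device latency
    return std::accumulate(data.begin(), data.end(), 0.0);
  });
}

int main() {
  std::vector<double> hits(1'000'000, 0.5);

  // 1. launch the offloaded computation
  auto pending = offloadToCoprocessor(hits);

  // 2. keep the CPU busy with other, independent work in the meantime
  double cpuWork = 0.0;
  for (int i = 0; i < 1000; ++i) cpuWork += std::sqrt(static_cast<double>(i));

  // 3. resume with the co-processor result once it is ready
  std::cout << "offloaded sum = " << pending.get()
            << ", cpu work = " << cpuWork << '\n';
}
```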
The Data Quality Monitoring (DQM) Software is a central tool in the CMS experiment. Its flexibility allows for integration in several key environments: Online, for real-time detector monitoring; Offline, for the final, fine-grained data analysis and certification; Release-Validation, to constantly validate the functionality and performance of the reconstruction software; Monte Carlo productions. Since the end of data taking at a center of mass energy of 8 TeV, the environment in which the DQM lives has undergone fundamental changes. In turn, the system...
The CMS tracking code is organized in several levels, known as iterative steps, each optimized to reconstruct a class of particle trajectories, such as the ones of particles originating from the primary vertex or displaced tracks resulting from secondary vertices. Each step consists of seeding, pattern recognition and fitting by a Kalman filter, and a final filtering and cleaning. Each subsequent step works on hits not yet associated to a reconstructed trajectory.
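For context, the fitting stage applies the standard Kalman-filter prediction and update at each hit; in common notation,

$$
\begin{aligned}
x_k^{\text{pred}} &= F_k\, x_{k-1}, \qquad
C_k^{\text{pred}} = F_k C_{k-1} F_k^{T} + Q_k,\\
K_k &= C_k^{\text{pred}} H_k^{T}\left(H_k C_k^{\text{pred}} H_k^{T} + V_k\right)^{-1},\\
x_k &= x_k^{\text{pred}} + K_k\left(m_k - H_k x_k^{\text{pred}}\right), \qquad
C_k = \left(I - K_k H_k\right) C_k^{\text{pred}},
\end{aligned}
$$

where $x_k$ is the track state with covariance $C_k$, $F_k$ the propagation matrix between detector layers, $H_k$ the projection onto measurement space, $m_k$ the measured hit with covariance $V_k$, and $Q_k$ the process noise (e.g. multiple scattering).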
We present an important milestone for the CMS High Granularity Calorimeter (HGCAL) event reconstruction: the deployment of the GPU clustering algorithm (CLUE) to the CMS software. The connection between CLUE and the preceding calibration step is thus made possible, further extending the heterogeneous chain of HGCAL’s reconstruction framework. In addition to the improvements brought by CLUE’s deployment, new recursive device kernels are added to efficiently calculate the position and energy of clusters. Data conversions between CPU...
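As an illustration of the kind of per-cluster reduction such kernels perform, below is a sketch of an energy-weighted cluster position calculation using log-energy weights, a common technique in calorimeter reconstruction to suppress noise-level hits. The code and the value of `w0` are assumptions for illustration, not the actual HGCAL device kernels.

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <vector>

struct Hit { float x, y, energy; };
struct ClusterPosition { float x, y; };

// Log-energy-weighted barycenter: hits carrying less than a fraction
// exp(-w0) of the cluster's total energy receive zero weight, so the
// position is driven by the energetic core of the shower.
ClusterPosition clusterPosition(const std::vector<Hit>& hits, float w0 = 2.9f) {
  float etot = 0.f;
  for (const auto& h : hits) etot += h.energy;

  float wsum = 0.f, wx = 0.f, wy = 0.f;
  for (const auto& h : hits) {
    float w = std::max(0.f, w0 + std::log(h.energy / etot));
    wsum += w;
    wx += w * h.x;
    wy += w * h.y;
  }
  if (wsum == 0.f) return {0.f, 0.f};
  return {wx / wsum, wy / wsum};
}

int main() {
  std::vector<Hit> cluster{{1.0f, 2.0f, 5.f}, {1.2f, 2.1f, 3.f}, {0.9f, 1.8f, 0.01f}};
  ClusterPosition p = clusterPosition(cluster);
  std::printf("cluster position: (%.3f, %.3f)\n", p.x, p.y);
}
```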
The future High Luminosity LHC (HL-LHC) is expected to deliver about 5 times higher instantaneous luminosity than the present LHC, resulting in pile-up of up to 200 interactions per bunch crossing (PU200). As part of its Phase-2 upgrade program, the CMS collaboration is developing a new endcap calorimeter system, the High Granularity Calorimeter (HGCAL), featuring highly-segmented hexagonal silicon sensors and scintillators with more than 6 million channels. For each event, the HGCAL clustering algorithm needs to group about $10^{5}$ hits...
To sustain the harsher conditions of the high-luminosity LHC [1], the CMS Collaboration [2] is designing a novel endcap calorimeter system [3]. The new calorimeter will predominantly use silicon sensors to achieve sufficient radiation tolerance and will maintain highly granular information in the readout to help mitigate the effects of pile-up. In regions characterized by lower radiation levels, small scintillator tiles with individual on-tile SiPMs are employed. A unique reconstruction framework (TICL: The Iterative CLustering) is being...
High granularity calorimeters have become increasingly crucial in modern particle physics experiments, and their importance is set to grow even further in the future. The CLUstering of Energy (CLUE) algorithm has shown excellent performance in clustering the calorimeter hits of the High Granularity Calorimeter (HGCAL) developed for the Phase-2 upgrade of the CMS experiment. In this paper, the suitability of CLUE for future collider experiments has been investigated and its capabilities have been tested outside of the HGCAL reconstruction software. To this end, a...
The High-Luminosity upgrade of the LHC will see the accelerator reach an instantaneous luminosity of $7\times 10^{34}\,\mathrm{cm}^{-2}\mathrm{s}^{-1}$ with an average pileup of $200$ proton-proton collisions. These conditions pose an unprecedented challenge to the online and offline reconstruction software developed by the experiments. The computational complexity will far exceed the expected increase in processing power for conventional CPUs, demanding an alternative approach. Industry and High-Performance Computing (HPC) centres are successfully...
We present the porting to heterogeneous architectures of the algorithm used for applying linear transformations to the raw energy deposits in the CMS High Granularity Calorimeter (HGCAL). This is the first algorithm to be fully integrated with HGCAL’s reconstruction chain. After introducing the latter and giving a brief description of the structural components of HGCAL relevant to this work, the role of the calibration is reviewed. The many ways in which parallelization is achieved are described, and the successful validation is covered. Detailed performance...
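As a point of reference, a per-channel linear calibration is an embarrassingly parallel transformation: each output depends on exactly one input, so the loop maps directly onto one GPU thread per hit in a heterogeneous implementation. The sketch below shows the pattern with illustrative names and constants; it is not HGCAL’s actual calibration model.

```cpp
#include <cstdio>
#include <vector>

// Per-channel calibration constants (illustrative).
struct Calib { float pedestal, gain; };

// Apply energy[i] = (raw[i] - pedestal[i]) * gain[i] to every hit.
// Each iteration is independent, so the loop parallelizes trivially.
void calibrate(const std::vector<float>& raw, const std::vector<Calib>& c,
               std::vector<float>& energy) {
  for (std::size_t i = 0; i < raw.size(); ++i)
    energy[i] = (raw[i] - c[i].pedestal) * c[i].gain;
}

int main() {
  std::vector<float> raw{100.f, 250.f, 80.f};
  std::vector<Calib> c{{2.f, 0.01f}, {3.f, 0.012f}, {1.5f, 0.009f}};
  std::vector<float> energy(raw.size());
  calibrate(raw, c, energy);
  for (float e : energy) std::printf("calibrated energy: %.3f\n", e);
}
```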
The configuration of the CMS Pixel detector consists of a complex set of data that uniquely define its startup condition and optimized calibration constants. Since several of these conditions are used both to calibrate the detector over time and to properly initialize it for physics runs, all of them have been collected in a suitably designed database for historical archival and retrieval. In this paper we present a description of the underlying database schema, with particular emphasis on the architecture and implementation of the web-based interface that allows very...
High granularity calorimeters have become increasingly crucial in modern particle physics experiments, and their importance is set to grow even further in the future. The CLUstering of Energy (CLUE) algorithm has shown excellent performance in clustering the calorimeter hits of the High Granularity Calorimeter (HGCAL) developed for the Phase-2 upgrade of the CMS experiment. In this paper, we investigate the suitability of CLUE for future collider experiments and test its capabilities outside of the HGCAL reconstruction software. To this end, a new...