- Theoretical and Experimental Particle Physics Studies
- High-Energy Particle Collisions Research
- Quantum Chromodynamics and Particle Interactions
- Particle Detector Development and Performance
- Dark Matter and Cosmic Phenomena
- Cosmology and Gravitation Theories
- Computational Physics and Python Applications
- Neutrino Physics Research
- Black Holes and Theoretical Physics
- Distributed and Parallel Computing Systems
- Astrophysics and Cosmic Phenomena
- Radiation Detection and Scintillator Technologies
- Advanced Data Storage Technologies
- Advanced Mathematical Theories
- Muon and positron interactions and applications
- Particle Accelerators and Free-Electron Lasers
- Medical Imaging Techniques and Applications
- Scientific Computing and Data Management
- Particle Accelerators and Beam Dynamics
- Nuclear Reactor Physics and Engineering
- Parallel Computing and Optimization Techniques
- Cold Fusion and Nuclear Reactions
- Quantum Computing Algorithms and Architecture
- Advanced Data Compression Techniques
- SARS-CoV-2 Detection and Testing
University of Copenhagen
2014-2023
The University of Adelaide
2013-2020
Northern Illinois University
2016-2020
National Institute for Research and Development of Isotopic and Molecular Technologies
2019-2020
Lund University
2020
Max Planck Institute for Physics
2020
University of Belgrade
2014-2017
Universidad de Granada
2014-2016
Universitatea Națională de Știință și Tehnologie Politehnica București
2015-2016
LIP - Laboratory of Instrumentation and Experimental Particle Physics
2014-2016
We present the FP420 R&D project, which has been studying key aspects of the development and installation of a silicon tracker and fast-timing detectors in the LHC tunnel at 420 m from the interaction points of the ATLAS and CMS experiments. These detectors would precisely measure very forward protons in conjunction with the corresponding central detectors, as a means to study Standard Model (SM) physics and to search for and characterise new physics signals. This report includes a detailed description of the physics case for the detector and, in particular, the measurement of Central Exclusive...
This is the summary and introduction to the proceedings contributions of the Les Houches 2009 "Tools and Monte Carlo" working group.
We make the connection between certain deep learning architectures and the renormalisation group explicit in the context of QCD by using a deep learning network to construct a toy parton shower model. The model aims to describe proton-proton collisions at the Large Hadron Collider. A convolutional autoencoder learns a set of kernels that efficiently encode the behaviour of fully showered QCD collision events. The network is structured recursively so as to ensure self-similarity, and the number of trained parameters is low. Randomness is introduced via a novel custom...
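The full convolutional autoencoder is beyond a short sketch, but the key structural idea in the abstract, reusing the same trained kernel at every depth of the shower so that the model is self-similar and the parameter count stays low, can be illustrated with a toy recursive splitter. Everything here (the uniform momentum-fraction kernel, the depth and energy cutoffs) is an illustrative assumption, not the paper's actual model.

```python
import random

def split(energy, depth, z_kernel, max_depth=4, cutoff=1.0):
    """Recursively split a parton's energy using the SAME splitting
    kernel at every depth -- the parameter sharing that makes the
    shower self-similar and keeps the trained parameter count low."""
    if depth >= max_depth or energy < cutoff:
        return [energy]  # terminate: final-state "hadron"
    z = z_kernel()       # momentum fraction drawn from the shared kernel
    return (split(z * energy, depth + 1, z_kernel, max_depth, cutoff)
            + split((1 - z) * energy, depth + 1, z_kernel, max_depth, cutoff))

random.seed(2)
# toy kernel: uniform momentum fraction; a trained network would replace this
shower = split(100.0, 0, lambda: random.uniform(0.2, 0.8))
```

Because each splitting conserves energy exactly, the summed final-state energy always equals the initial 100 units, whatever kernel is plugged in.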
Light $\mathrm{CP}$-violating Higgs bosons with a mass lower than 70 GeV might have escaped detection in direct searches at the CERN LEP collider. They may also remain undetected in conventional search channels at the Fermilab Tevatron and the LHC. In this paper we show that exclusive diffractive reactions may be able to probe for the existence of these otherwise elusive particles. As a prototype example, we calculate the production cross sections of the lightest Higgs boson within the framework of the minimal supersymmetric standard model with explicit...
Collision experiments at the Large Hadron Collider suffer from the problem of pile-up, which is the read-out of multiple simultaneous background proton-proton collisions per beam-crossing. We introduce a pile-up mitigation technique based on wavelet decomposition. Pile-up is treated as a form of white noise, which can be removed by filtering beam-crossing events in the wavelet domain. The particle-level performance of the method is evaluated using a sample of simulated collision events that contain Z bosons decaying to a pair of neutrinos, overlaid with...
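The filtering step can be sketched with a one-level Haar transform in pure Python: transform, threshold away the small detail coefficients where white noise lives, and transform back. The toy "event" (two isolated hard deposits), the Gaussian noise level, and the threshold value are all illustrative assumptions, not the paper's tuned parameters.

```python
import random

def haar_fwd(x):
    """One level of the Haar wavelet transform: pairwise averages + details."""
    avg = [(x[i] + x[i + 1]) / 2 for i in range(0, len(x), 2)]
    det = [(x[i] - x[i + 1]) / 2 for i in range(0, len(x), 2)]
    return avg, det

def haar_inv(avg, det):
    """Invert the transform: each (average, detail) pair restores two samples."""
    out = []
    for a, d in zip(avg, det):
        out += [a + d, a - d]
    return out

random.seed(0)
# hard "signal": two isolated energy deposits; "pile-up": white noise on top
signal = [0.0] * 16
signal[3], signal[11] = 10.0, 7.0
noisy = [s + random.gauss(0, 0.5) for s in signal]

avg, det = haar_fwd(noisy)
# threshold the small-scale detail coefficients, where white noise dominates
det = [d if abs(d) > 1.0 else 0.0 for d in det]
denoised = haar_inv(avg, det)
```

Pairs containing a hard deposit produce a large detail coefficient and survive the threshold intact, while noise-only pairs are smoothed to their local average; a realistic analysis would use a multi-level 2-D decomposition over the eta-phi plane.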
Wavelet decomposition is a method that has been applied to signal processing in a wide range of subjects. The decomposition isolates small-scale features from large-scale features, while also maintaining information about where those features occur. Wavelets obey particular scaling relations and are especially suited to the analysis of systems that are self-similar or scale invariant. They are therefore a natural tool for the study of hadron collisions. This paper introduces the use of wavelets for de-noising (the removal of soft activity), studying the behaviour of the parton shower,...
The CEDAR collaboration is extending and combining the JetWeb and HepData systems to provide a single service for tuning and validating models of high-energy physics processes. The centrepiece of this activity is the fitting of observables computed from Monte Carlo event generator events against their experimentally determined distributions, as stored in HepData. Caching the results of the simulation and comparison stages provides a cumulative database of tunings, fitted across a wide range of experimental quantities. An important feature...
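The tune-fitting loop described above can be sketched in miniature: cache generator predictions per parameter point, compare them to a reference distribution via a chi-square, and keep the best tune. The linear toy "generator", the reference values, and the scan grid are hypothetical stand-ins; a real tuning runs a full MC event generator against HepData distributions.

```python
from functools import lru_cache

# toy stand-in for an expensive MC generator run at one parameter point;
# caching mirrors CEDAR's reuse of simulation results across tunings
@lru_cache(maxsize=None)
def simulate(param):
    return tuple(param * x for x in (1.0, 2.0, 3.0))

measured = (2.0, 4.1, 5.9)   # toy reference distribution ("HepData")
errors   = (0.2, 0.2, 0.2)   # experimental uncertainties

def chi2(param):
    """Goodness of fit of the generator prediction to the measurement."""
    pred = simulate(param)
    return sum(((p - m) / e) ** 2 for p, m, e in zip(pred, measured, errors))

# scan the parameter grid and keep the best-fitting tune
best_chi2, best_param = min((chi2(p / 100), p / 100) for p in range(100, 300))
```

A second scan over the same grid hits the cache rather than re-running the simulation, which is exactly the cumulative-database idea in the abstract.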
This document describes how Monte Carlo (MC) generators can be used in the ATLAS software framework (Athena). The framework is written in C++ and uses Python scripts for job configuration. The generator libraries that provide four-vectors describing the results of LHC collisions are in general written by third parties and are not part of Athena. These libraries are linked from the LCG Generator Services (GENSER) distribution. Generators are run from within Athena and the generated event output is put into a transient store, in HepMC format, using StoreGate. A common interface, implemented...
We present the HepMC3 library, designed to perform manipulations with event records of High Energy Physics Monte Carlo Event Generators (MCEGs). The library is a natural successor to the HepMC and HepMC2 libraries used in the past. It supports all the functionality of the previous versions and significantly extends it. In comparison to the previous versions, the default event record has been simplified, while an option to add arbitrary information has been implemented. Particles and vertices are stored separately in an ordered graph structure, reflecting the evolution...
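The "particles and vertices stored separately in a graph" data model can be sketched as follows. This is not the actual HepMC3 API (which is C++); it is a minimal illustration of the event-record structure, with all class and field names invented for the example.

```python
from dataclasses import dataclass, field

@dataclass
class Particle:
    pid: int          # PDG particle id (21 = gluon, 2 = u quark, ...)
    momentum: tuple   # (px, py, pz, E), toy numbers below

@dataclass
class Vertex:
    incoming: list = field(default_factory=list)  # indices into Event.particles
    outgoing: list = field(default_factory=list)

@dataclass
class Event:
    # particles and vertices live in separate ordered containers;
    # vertices reference particles by index, forming a directed graph
    particles: list = field(default_factory=list)
    vertices: list = field(default_factory=list)

    def add_particle(self, p):
        self.particles.append(p)
        return len(self.particles) - 1

# a gluon splitting g -> q qbar recorded as one vertex of the evolution graph
ev = Event()
g    = ev.add_particle(Particle(21, (0.0, 0.0, 50.0, 50.0)))
q    = ev.add_particle(Particle(2,  (1.0, 0.0, 24.0, 24.1)))
qbar = ev.add_particle(Particle(-2, (-1.0, 0.0, 26.0, 26.1)))
ev.vertices.append(Vertex(incoming=[g], outgoing=[q, qbar]))
```

Storing the graph edges on the vertices rather than on the particles keeps each container flat and ordered, which mirrors the simplification of the default record mentioned in the abstract.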
The Vulcan Nd:glass laser at the Central Laser Facility (CLF) has recently been upgraded to the Petawatt level (10<sup>15</sup> Watts). The three-year upgrade project was contracted to deliver 500 J in a near diffraction-limited pulse of fs duration. The facility will deliver an irradiance on target of 10<sup>21</sup> W·cm<sup>-2</sup> for a wide-ranging experimental programme of fundamental physics and advanced applications. This includes the interaction of super-high-intensity light with matter, fast ignition fusion...
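The quoted figures hang together arithmetically, which a back-of-envelope check makes explicit. Two assumptions are mine, not the source's: a ~500 fs pulse (the duration that makes 500 J correspond to 1 PW) and a ~5 µm focal-spot radius for the irradiance estimate.

```python
import math

energy_J   = 500.0
duration_s = 500e-15                 # ASSUMED pulse duration (~500 fs)
power_W    = energy_J / duration_s   # peak power: 1e15 W = 1 Petawatt

spot_radius_cm = 5e-4                # ASSUMED ~5 micron focal radius
area_cm2   = math.pi * spot_radius_cm ** 2
irradiance = power_W / area_cm2      # on-target irradiance in W/cm^2
```

With these inputs the irradiance comes out at roughly 10<sup>21</sup> W·cm<sup>-2</sup>, consistent with the figure quoted in the abstract.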
The CEDAR collaboration has developed, and continues to develop, a set of software tools for High Energy Physics phenomenology. We outline the status of three core projects: HepData, a database of experimental measurements; JetWeb, a validation tool for Monte Carlo event generators; and Rivet, for analysing Monte Carlos in a highly reproducible way.
Hadronic final states in hadron-hadron collisions are often studied by clustering the final-state hadrons into jets, each jet approximately corresponding to a hard parton. The typical size of a jet in a high energy hadron collision is between 0.4 and 1.0 in eta-phi. On the other hand, there may be structures of interest in an event that have a different scale size. For example, to a first approximation the underlying event produces uniform emission of radiation spanning the entire detector, and colour connection effects between partons fill the region between a proton remnant...
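The fixed-scale nature of jet clustering can be sketched with a greedy cone algorithm: seed on the hardest remaining hadron and sweep up everything within a radius in the eta-phi plane. This is an illustration of the idea only; real analyses use sequential-recombination algorithms such as anti-kt, and the hadron list here is invented.

```python
import math

def delta_r(a, b):
    """Distance in the eta-phi plane, with phi wrapped into (-pi, pi]."""
    dphi = (a[1] - b[1] + math.pi) % (2 * math.pi) - math.pi
    return math.hypot(a[0] - b[0], dphi)

def cone_cluster(hadrons, radius=0.4):
    """Greedy fixed-cone clustering of (eta, phi, pt) tuples.
    Returns jets as (seed_eta, seed_phi, summed_pt)."""
    remaining = sorted(hadrons, key=lambda h: -h[2])  # hardest first
    jets = []
    while remaining:
        seed = remaining[0]
        jet = [h for h in remaining if delta_r(seed, h) < radius]
        remaining = [h for h in remaining if h not in jet]
        jets.append((seed[0], seed[1], sum(h[2] for h in jet)))
    return jets

# two hard deposits plus soft radiation spread at a much larger scale
hadrons = [(0.0, 0.0, 40.0), (0.05, 0.1, 5.0), (2.0, 3.0, 30.0),
           (-1.5, -2.0, 0.5), (1.0, 1.0, 0.4)]
jets = cone_cluster(hadrons, radius=0.4)
```

The fixed radius captures the two hard structures cleanly but, as the abstract argues, it is blind to features living at other scales, such as the detector-wide underlying event, which is where a multi-scale (wavelet) view helps.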
James Monk, ATLAS Collaboration; Forward Diffractive Physics with Early Data. AIP Conf. Proc. 1105 (1): 82–85, 23 March 2009. https://doi.org/10.1063/1.3122232