- Theoretical and Experimental Particle Physics Studies
- High-Energy Particle Collisions Research
- Particle Detector Development and Performance
- Quantum Chromodynamics and Particle Interactions
- Dark Matter and Cosmic Phenomena
- Computational Physics and Python Applications
- Cosmology and Gravitation Theories
- Neutrino Physics Research
- Distributed and Parallel Computing Systems
- Gaussian Processes and Bayesian Inference
- Scientific Computing and Data Management
- Black Holes and Theoretical Physics
- Astrophysics and Cosmic Phenomena
- Advanced Data Storage Technologies
- Superconducting Materials and Applications
- Medical Imaging Techniques and Applications
- Radiation Detection and Scintillator Technologies
- Bayesian Methods and Mixture Models
- Particle Accelerators and Free-Electron Lasers
- Generative Adversarial Networks and Image Synthesis
- Algorithms and Data Compression
- Big Data Technologies and Applications
- Neural Networks and Applications
- Statistical Methods and Inference
- Geophysics and Gravity Measurements
University of Wisconsin–Madison
2001-2025
New York University
2015-2024
European Organization for Nuclear Research
2005-2024
Federación Española de Enfermedades Raras
2024
AGH University of Krakow
2012-2024
University of Toronto
2019-2024
Jagiellonian University
2017-2024
Fermi National Accelerator Laboratory
2024
University of Chicago
2024
Atlas Scientific (United States)
2024
In October 2018, an APS Physics Next Workshop on Machine Learning was held in Riverhead, NY. This article reviews and summarizes the proceedings of this very broad, emerging field.
We describe likelihood-based statistical tests for use in high energy physics for the discovery of new phenomena and for the construction of confidence intervals on model parameters. We focus on the properties of test procedures that allow one to account for systematic uncertainties. Explicit formulae for the asymptotic distributions of test statistics are derived using results due to Wilks and Wald. We motivate and justify the use of a representative data set, called the "Asimov data set", which provides a simple method to obtain the median experimental sensitivity of a search or...
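The Asimov construction can be illustrated for a single-bin counting experiment, where the median discovery significance follows in closed form from the asymptotic formulae (a minimal sketch; the function name and toy numbers are ours):

```python
import math

def asimov_discovery_significance(s, b):
    """Median discovery significance for a single-bin counting experiment
    with expected signal s and background b, using the asymptotic (Asimov)
    formula Z = sqrt(2 * ((s + b) * ln(1 + s/b) - s))."""
    return math.sqrt(2.0 * ((s + b) * math.log(1.0 + s / b) - s))

# In the s << b limit this approaches the familiar s/sqrt(b) estimate.
z = asimov_discovery_significance(10.0, 100.0)  # close to 10/sqrt(100)
```

For s comparable to b the Asimov formula and the naive s/sqrt(b) estimate differ noticeably, which is one motivation for using the full expression.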
Many domains of science have developed complex simulations to describe phenomena of interest. While these simulations provide high-fidelity models, they are poorly suited for inference and lead to challenging inverse problems. We review the rapidly developing field of simulation-based inference and identify the forces giving additional momentum to the field. Finally, we describe how the frontier is expanding so that a broad audience can appreciate the profound influence these developments may have on science.
The Review summarizes much of particle physics and cosmology. Using data from previous editions, plus 2,717 new measurements from 869 papers, we list, evaluate, and average measured properties of the gauge bosons and the recently discovered Higgs boson, leptons, quarks, mesons, and baryons. We summarize searches for hypothetical particles such as supersymmetric particles, heavy bosons, axions, dark photons, etc. Particle search limits are listed in Summary Tables. We give numerous tables, figures, formulae, and reviews of topics...
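The averaging behind such tables can be sketched for a single quantity: an error-weighted mean whose uncertainty is inflated by a scale factor when the measurements scatter more than their quoted errors suggest, following the Review's standard prescription (the function name and numbers below are illustrative):

```python
import math

def pdg_weighted_average(values, errors):
    """Error-weighted average of independent measurements.  When
    chi^2/(N-1) > 1 the combined error is scaled by
    S = sqrt(chi^2/(N-1)), as in the Review's averaging procedure."""
    w = [1.0 / e**2 for e in errors]
    mean = sum(wi * v for wi, v in zip(w, values)) / sum(w)
    err = 1.0 / math.sqrt(sum(w))
    chi2 = sum(wi * (v - mean) ** 2 for wi, v in zip(w, values))
    n = len(values)
    scale = math.sqrt(chi2 / (n - 1)) if n > 1 and chi2 > (n - 1) else 1.0
    return mean, err * scale

# Three toy measurements that scatter more than their errors suggest:
m, e = pdg_weighted_average([1.0, 1.2, 0.9], [0.1, 0.1, 0.1])
```

The scale factor is a conservative device: it widens the error rather than attempting to decide which of the discrepant measurements is wrong.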
We present a search at the Jefferson Laboratory for new forces mediated by sub-GeV vector bosons with weak coupling $\alpha'$ to electrons. Such a particle $A'$ can be produced in electron-nucleus fixed-target scattering and then decay to an $e^+e^-$ pair, producing a narrow resonance in the QED trident spectrum. Using APEX test run data, we searched the mass range 175--250 MeV and found no evidence...
We investigate a new structure for machine learning classifiers built with neural networks and applied to problems in high-energy physics by expanding the inputs to include not only measured features but also physics parameters. The parameters represent a smoothly varying learning task, and the resulting parameterized classifier can interpolate between them and replace sets of classifiers trained at individual parameter values. This simplifies the training process and gives improved performance at intermediate values, even for complex problems requiring deep learning....
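The idea can be sketched with a one-dimensional toy: a logistic classifier whose input includes the parameter theta, trained only at theta = 1 and theta = 3, still separates signal from background at the unseen value theta = 2 (the feature map, data model, and all numbers are illustrative assumptions, not the paper's setup):

```python
import numpy as np

rng = np.random.default_rng(0)

def features(x, theta):
    # Feature map with interaction terms so the decision boundary can
    # depend on theta (an illustrative choice, not the paper's network).
    return np.stack([np.ones_like(x), x, theta, x * theta, theta**2], axis=1)

# Toy training data: signal x ~ N(theta, 1), background x ~ N(0, 1),
# generated only at the parameter points theta = 1 and theta = 3.
theta_tr = rng.choice([1.0, 3.0], size=5000)
x_sig = rng.normal(theta_tr, 1.0)
x_bkg = rng.normal(0.0, 1.0, size=5000)
X = np.vstack([features(x_sig, theta_tr), features(x_bkg, theta_tr)])
y = np.concatenate([np.ones(5000), np.zeros(5000)])

# Logistic regression trained by plain gradient descent.
w = np.zeros(X.shape[1])
for _ in range(8000):
    p = 1.0 / (1.0 + np.exp(-X @ w))
    w -= 0.05 * X.T @ (p - y) / len(y)

# Evaluate at theta = 2, never seen in training: the parameterized
# classifier interpolates between the trained parameter points.
x_test = np.concatenate([rng.normal(2.0, 1.0, 2500), rng.normal(0.0, 1.0, 2500)])
y_test = np.concatenate([np.ones(2500), np.zeros(2500)])
p_test = 1.0 / (1.0 + np.exp(-features(x_test, np.full(5000, 2.0)) @ w))
accuracy = np.mean((p_test > 0.5) == y_test)
```

A single parameterized model thus stands in for a family of fixed-parameter classifiers, which is the point the abstract makes.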
We develop a general approach to distill symbolic representations of a learned deep model by introducing strong inductive biases. We focus on Graph Neural Networks (GNNs). The technique works as follows: we first encourage sparse latent representations when we train a GNN in a supervised setting, then we apply symbolic regression to components of the learned model to extract explicit physical relations. We find that correct known equations, including force laws and Hamiltonians, can be extracted from the neural network. We then apply our method to a non-trivial cosmology example--a...
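The second stage, extracting an explicit relation from a fitted component, can be sketched with sequentially thresholded least squares over a library of candidate terms (a simple stand-in for the GNN-plus-symbolic-regression pipeline; the data, library, and threshold are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "learned component": the true force law F = -1.5*x plus small noise.
x = rng.uniform(-2.0, 2.0, 500)
F = -1.5 * x + 0.01 * rng.normal(size=500)

# Library of candidate symbolic terms.
library = np.stack([x, x**2, x**3, np.cos(x)], axis=1)
names = ["x", "x^2", "x^3", "cos(x)"]

# Sequentially thresholded least squares: fit, zero out small
# coefficients, refit on the surviving terms.
coef, *_ = np.linalg.lstsq(library, F, rcond=None)
for _ in range(5):
    small = np.abs(coef) < 0.1
    coef[small] = 0.0
    active = ~small
    if active.any():
        coef[active], *_ = np.linalg.lstsq(library[:, active], F, rcond=None)

expr = " + ".join(f"{c:.2f}*{n}" for c, n in zip(coef, names) if c != 0.0)
```

With clean enough data the procedure recovers the single term F ≈ -1.5*x, mirroring how the paper reads force laws off a trained network's message components.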
Recent progress in applying machine learning for jet physics has been built upon an analogy between calorimeters and images. In this work, we present a novel class of recursive neural networks building instead upon an analogy between QCD and natural languages. In the analogy, four-momenta are like words and the clustering history of sequential recombination jet algorithms is like the parsing of a sentence. Our approach works directly with a variable-length set of particles, with a jet-based tree structure that varies on an event-by-event basis. Experiments highlight...
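The analogy can be made concrete with a miniature sequential-recombination clusterer that records its merge history as a binary tree, which is the structure the recursive network then traverses (a simplified kt-style toy: the beam distance and proper four-vector recombination are omitted, and the numbers are invented):

```python
import math

def kt_cluster(particles, R=1.0):
    """Toy sequential-recombination clustering: repeatedly merge the pair
    with the smallest d_ij = min(pt_i^2, pt_j^2) * dR^2 / R^2, recording
    the binary tree.  Each node is (pt, y, phi, left, right); leaves have
    left = right = None."""
    nodes = [(pt, y, phi, None, None) for pt, y, phi in particles]
    while len(nodes) > 1:
        best = None
        for i in range(len(nodes)):
            for j in range(i + 1, len(nodes)):
                dphi = abs(nodes[i][2] - nodes[j][2])
                dphi = min(dphi, 2 * math.pi - dphi)
                dr2 = (nodes[i][1] - nodes[j][1]) ** 2 + dphi**2
                dij = min(nodes[i][0] ** 2, nodes[j][0] ** 2) * dr2 / R**2
                if best is None or dij < best[0]:
                    best = (dij, i, j)
        _, i, j = best
        a, b = nodes[i], nodes[j]
        # pt-weighted recombination (a simplification of the E-scheme).
        merged = (a[0] + b[0],
                  (a[0] * a[1] + b[0] * b[1]) / (a[0] + b[0]),
                  (a[0] * a[2] + b[0] * b[2]) / (a[0] + b[0]),
                  a, b)
        nodes = [n for k, n in enumerate(nodes) if k not in (i, j)] + [merged]
    return nodes[0]

tree = kt_cluster([(50.0, 0.0, 0.0), (30.0, 0.1, 0.1), (10.0, 1.5, 2.0)])
```

The returned tree varies event by event, exactly the variable topology that makes recursive (rather than fixed-grid) networks a natural fit.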
Machine learning has played an important role in the analysis of high-energy physics data for decades. The emergence of deep learning in 2012 allowed for machine learning tools which could adeptly handle higher-dimensional and more complex problems than previously feasible. This review is aimed at the reader who is familiar with high energy physics but not machine learning. The connections between the two fields are explored, followed by an introduction to the core concepts of neural networks, examples of key results demonstrating the power of deep learning on LHC data, and a discussion of future...
We define a class of machine-learned flow-based sampling algorithms for lattice gauge theories that are gauge-invariant by construction. We demonstrate the application of this framework to U(1) gauge theory in two spacetime dimensions, and find that near critical points in parameter space the approach is orders of magnitude more efficient at sampling topological quantities than traditional sampling procedures such as Hybrid Monte Carlo and Heat Bath.
Simulators often provide the best description of real-world phenomena. However, they also lead to challenging inverse problems because the density they implicitly define is intractable. We present a new suite of simulation-based inference techniques that go beyond the traditional Approximate Bayesian Computation approach, which struggles in a high-dimensional setting, and extend methods that use surrogate models based on neural networks. We show that additional information, such as the joint likelihood ratio and joint score, can be...
Statistical analysis of High Energy Physics (HEP) data relies on quantifying the compatibility of observed collision events with theoretical predictions. The relationship between them is often formalised in a statistical model f(x|ϕ) describing the probability of data x given model parameters ϕ. Given observed data, the likelihood L(ϕ) then serves as the basis for inference on ϕ. For measurements based on binned data (histograms), the HistFactory family of statistical models (Cranmer et al., 2012) has been widely used in both Standard Model (ATLAS Collaboration,...
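A HistFactory-style model of this kind can be written out longhand for a single channel: a product of Poisson terms per bin times a constraint term for a nuisance parameter (a minimal sketch with invented bin contents; real analyses build such models with pyhf or RooFit rather than by hand):

```python
import math

def poisson_logpmf(n, lam):
    return n * math.log(lam) - lam - math.lgamma(n + 1)

def log_likelihood(mu, gamma, data, signal, bkg, bkg_rel_unc):
    """Single-channel HistFactory-style model: expected counts
    mu*s_i + gamma*b_i per bin, with a Gaussian constraint on the
    background normalisation gamma (auxiliary measurement at 1)."""
    ll = sum(poisson_logpmf(n, mu * s + gamma * b)
             for n, s, b in zip(data, signal, bkg))
    ll += -0.5 * ((gamma - 1.0) / bkg_rel_unc) ** 2  # constraint term
    return ll

# Two toy bins where the data favour the signal-plus-background hypothesis.
data = [53, 65]
signal = [10.0, 5.0]
bkg = [45.0, 60.0]
ll_sb = log_likelihood(1.0, 1.0, data, signal, bkg, 0.1)  # mu = 1
ll_b = log_likelihood(0.0, 1.0, data, signal, bkg, 0.1)   # mu = 0
```

The likelihood L(ϕ) evaluated this way, with ϕ = (mu, gamma), is exactly the object on which the inference described above is based.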
First-principle simulations are at the heart of the high-energy physics research program. They link the vast data output of multi-purpose detectors with fundamental theory predictions and interpretation. This review illustrates a wide range of applications of modern machine learning to event generation and simulation-based inference, including conceptual developments driven by the specific requirements of particle physics. New ideas and tools developed at this interface will improve the speed and precision of forward simulations, handle...
RooStats is a project to create advanced statistical tools required for the analysis of LHC data, with emphasis on discoveries, confidence intervals, and combined measurements. The idea is to provide the major statistical techniques as a set of C++ classes with coherent interfaces, so that they can be used on arbitrary models and datasets in a common way. The classes are built on top of the RooFit package, which provides functionality for easily creating probability models, for analysis combinations, and for digital publication of results. We will present in detail the design and implementation...
This report summarizes the work of the Energy Frontier Higgs Boson working group of the 2013 Community Summer Study (Snowmass). We identify the key elements of a precision Higgs physics program and document the physics potential of future experimental facilities as elucidated during the Snowmass study. We study Higgs couplings to gauge boson and fermion pairs, double Higgs production for the self-coupling, its quantum numbers and $CP$-mixing in couplings, the Higgs mass and total width, and prospects for direct searches for additional Higgs bosons in extensions of the Standard Model. Our report includes...
In many fields of science, generalized likelihood ratio tests are established tools for statistical inference. At the same time, it has become increasingly common that a simulator (or generative model) is used to describe complex processes that tie the parameters $θ$ of an underlying theory and measurement apparatus to high-dimensional observations $\mathbf{x}\in \mathbb{R}^p$. However, simulators often do not provide a way to evaluate the likelihood function for a given observation $\mathbf{x}$, which motivates a new class of likelihood-free...
We develop, discuss, and compare several inference techniques to constrain theory parameters in collider experiments. By harnessing the latent-space structure of particle physics processes, we extract extra information from the simulator. This augmented data can be used to train neural networks that precisely estimate the likelihood ratio. The new methods scale well to many observables and high-dimensional parameter spaces, do not require any approximations of the parton shower and detector response, and can be evaluated...
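At the core of these methods is the classifier-based estimate of the likelihood ratio (the "likelihood ratio trick"): a classifier s(x) trained to separate samples from p1 and p0 yields r(x) = p1(x)/p0(x) via r = s/(1-s). A toy with two unit Gaussians, where the true ratio is known in closed form, illustrates it (this is the plain trick only, not the paper's augmented-data estimators; all numbers are ours):

```python
import numpy as np

rng = np.random.default_rng(2)

# Samples from p1 = N(1, 1) and p0 = N(0, 1); analytically the true
# log likelihood ratio is log r(x) = x - 1/2.
x1 = rng.normal(1.0, 1.0, 20000)
x0 = rng.normal(0.0, 1.0, 20000)
X = np.concatenate([x1, x0])
y = np.concatenate([np.ones(20000), np.zeros(20000)])
Phi = np.stack([np.ones_like(X), X], axis=1)  # bias + linear feature

# Logistic regression by gradient descent; the learned logit estimates
# log r(x) directly, since s/(1-s) = exp(logit).
w = np.zeros(2)
for _ in range(4000):
    p = 1.0 / (1.0 + np.exp(-Phi @ w))
    w -= 0.5 * Phi.T @ (p - y) / len(y)

log_r_at_1 = w[0] + w[1] * 1.0  # should approach the true value 0.5
```

The augmented quantities the paper extracts from the simulator (joint likelihood ratios and scores) act as extra regression targets that make this estimate far more sample-efficient in realistic, high-dimensional settings.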
We present the activities of the `New Physics' working group for the `Physics at TeV Colliders' workshop (Les Houches, France, 5--23 June, 2017). Our report includes new physics studies connected with the Higgs boson and its properties, direct search strategies, the reinterpretation of LHC results in the building of viable models, and computational tool developments.
The solutions adopted by the high-energy physics community to foster reproducible research are examples of best practices that could be embraced more widely. This first experience suggests that reproducibility requires going beyond openness.