- Theoretical and Experimental Particle Physics Studies
- High-Energy Particle Collisions Research
- Particle Detector Development and Performance
- Quantum Chromodynamics and Particle Interactions
- Dark Matter and Cosmic Phenomena
- Computational Physics and Python Applications
- Cosmology and Gravitation Theories
- Neutrino Physics Research
- Distributed and Parallel Computing Systems
- Scientific Computing and Data Management
- Research Data Management Practices
- Radiation Detection and Scintillator Technologies
- Advanced Data Storage Technologies
- Medical Imaging Techniques and Applications
- Black Holes and Theoretical Physics
- Atomic and Subatomic Physics Research
- Big Data Technologies and Applications
- Astrophysics and Cosmic Phenomena
- Generative Adversarial Networks and Image Synthesis
- Explainable Artificial Intelligence (XAI)
- Particle Accelerators and Free-Electron Lasers
- Model Reduction and Neural Networks
- Gaussian Processes and Bayesian Inference
- Noncommutative and Quantum Gravity Theories
- Digital Radiography and Breast Imaging
University of Illinois Urbana-Champaign
2009-2025
University of Oxford
2023-2025
University of Illinois System
2024-2025
Institute of Particle Physics
2019-2024
Université de Montréal
2019-2024
The University of Texas at Austin
2019-2024
Indian Institute of Technology Indore
2022-2024
The University of Adelaide
2019-2023
National Center for Supercomputing Applications
2022-2023
University of Geneva
2023
A foundational set of findable, accessible, interoperable, and reusable (FAIR) principles was proposed in 2016 as prerequisites for proper data management and stewardship, with the goal of enabling the reusability of scholarly data. The principles were also meant to apply to other digital assets, and, at a high level, over time the FAIR guiding principles have been re-interpreted or extended to include the software, tools, algorithms, and workflows that produce data, and are now being adapted to the context of AI models and datasets. Here, we present our perspectives, vision,...
Physics-informed neural networks (PINNs) have been shown to be effective in solving partial differential equations by capturing physics-induced constraints as part of the training loss function. This paper shows that a PINN can be sensitive to errors in the training data and can overfit itself in dynamically propagating these errors over the solution domain of the PDE. It also shows how physical regularizations based on continuity criteria and conservation laws fail to address this issue and rather introduce problems of their own, causing the deep network...
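The PINN loss structure described above can be sketched minimally. This is an illustrative example, not the paper's setup: the 1-D advection equation u_t + c u_x = 0 stands in for the PDE, finite differences stand in for automatic differentiation, and `model` is a placeholder for a trained network (here the exact travelling-wave solution, so the physics residual is near zero).

```python
import numpy as np

C = 1.0  # assumed advection speed, for illustration only

def model(x, t):
    # Stand-in "network": the exact solution sin(x - c*t),
    # so the physics residual below should vanish.
    return np.sin(x - C * t)

def physics_residual(x, t, h=1e-4):
    # Central finite differences in place of autodiff.
    u_t = (model(x, t + h) - model(x, t - h)) / (2 * h)
    u_x = (model(x + h, t) - model(x - h, t)) / (2 * h)
    return u_t + C * u_x

def pinn_loss(x_data, t_data, u_data, x_col, t_col, lam=1.0):
    # Data term: mean squared error against (possibly noisy) observations.
    data_loss = np.mean((model(x_data, t_data) - u_data) ** 2)
    # Physics term: PDE residual at collocation points. Errors absorbed
    # by the data term can propagate into this term over the domain,
    # which is the sensitivity the paper studies.
    phys_loss = np.mean(physics_residual(x_col, t_col) ** 2)
    return data_loss + lam * phys_loss

rng = np.random.default_rng(0)
x = rng.uniform(0, 2 * np.pi, 64)
t = rng.uniform(0, 1, 64)
loss = pinn_loss(x, t, np.sin(x - C * t), x, t)
print(loss)  # ~0 for the exact solution
```

The weight `lam` trades off fitting the data against satisfying the PDE; in a real PINN both terms are minimized jointly by gradient descent over the network parameters.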
Current methods commonly used for uncertainty quantification (UQ) in deep learning (DL) models utilize Bayesian methods, which are computationally expensive and time-consuming. In this paper, we provide a detailed study of UQ based on evidential deep learning (EDL) for a neural network designed to identify jets in high energy proton-proton collisions at the Large Hadron Collider, and explore its utility for anomaly detection. EDL is a DL approach that treats learning as an evidence acquisition process, where confidence (or epistemic uncertainty)...
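The evidence-acquisition idea can be shown in a few lines. This is a generic EDL sketch, not the paper's network: assume a classifier head that emits non-negative per-class "evidence", which parameterizes a Dirichlet distribution; the epistemic uncertainty is then read off in closed form instead of via expensive Bayesian sampling.

```python
import numpy as np

def edl_summary(evidence):
    # evidence: non-negative per-class outputs of a hypothetical EDL head
    # (e.g. produced by ReLU or softplus activation).
    evidence = np.asarray(evidence, dtype=float)
    k = evidence.size            # number of classes
    alpha = evidence + 1.0       # Dirichlet concentration parameters
    s = alpha.sum()              # total Dirichlet strength
    prob = alpha / s             # expected class probabilities
    uncertainty = k / s          # epistemic uncertainty in (0, 1]
    return prob, uncertainty

# Strong evidence for class 0 -> low uncertainty.
p_conf, u_conf = edl_summary([18.0, 0.0, 0.0])
# No evidence at all -> uniform probabilities, maximal uncertainty.
p_unk, u_unk = edl_summary([0.0, 0.0, 0.0])
print(u_conf, u_unk)
```

Because the uncertainty is a single forward-pass quantity, thresholding it is a natural handle for flagging anomalous jets that the network has accumulated little evidence about.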
Recent developments in the methods of explainable AI (XAI) allow researchers to explore the inner workings of deep neural networks (DNNs), revealing crucial information about input-output relationships and how data connects with machine learning models. In this paper we study the interpretability of DNN models designed to identify jets coming from top quark decay in high energy proton-proton collisions at the Large Hadron Collider (LHC). We review a subset of existing top tagger models and explore different quantitative methods to identify which features...
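One model-agnostic way to rank which features a tagger relies on is permutation importance; the sketch below is illustrative and uses a hypothetical stand-in scoring function, not any published top tagger. Shuffling one feature column breaks its link to the target, and the resulting loss increase measures how much the model depended on it.

```python
import numpy as np

rng = np.random.default_rng(1)

def tagger_score(X):
    # Hypothetical stand-in "tagger": depends strongly on feature 0,
    # weakly on feature 1, and not at all on feature 2.
    return 1.0 * X[:, 0] + 0.1 * X[:, 1]

def permutation_importance(X, y, n_repeats=10):
    base = np.mean((tagger_score(X) - y) ** 2)
    importances = []
    for j in range(X.shape[1]):
        losses = []
        for _ in range(n_repeats):
            Xp = X.copy()
            Xp[:, j] = rng.permutation(Xp[:, j])  # break feature-target link
            losses.append(np.mean((tagger_score(Xp) - y) ** 2))
        importances.append(np.mean(losses) - base)  # loss increase
    return np.array(importances)

X = rng.normal(size=(500, 3))
y = tagger_score(X)                 # noise-free labels, for clarity
imp = permutation_importance(X, y)
print(imp)                          # feature 0 dominates, feature 2 ~ 0
```

The same probe applies unchanged to a real DNN tagger by swapping `tagger_score` for the trained model's prediction function.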
Vector-like Quarks (VLQs) are potential signatures of physics beyond the Standard Model at the TeV energy scale, and major efforts have been put forward by both the ATLAS and CMS experiments in the search for these particles. In order to make the results more relatable in the context of the most plausible theories of VLQs, it is deemed important to present the analysis in a general fashion. We investigate the challenges associated with such interpretations of singly produced VLQ searches and propose a generalized, semi-analytical framework that allows...
The findable, accessible, interoperable, and reusable (FAIR) data principles provide a framework for examining, evaluating, and improving how data is shared to facilitate scientific discovery. Generalizing these principles to research software and other digital products is an active area of research. Machine learning models (algorithms that have been trained on data without being explicitly programmed) and, more generally, artificial intelligence (AI) models are an important target for this effort because of the ever-increasing pace...
With the improving energy resolution of transition-edge sensor (TES) based microcalorimeters, performance verification and calibration of these detectors have become increasingly challenging, especially in the range below 1 keV, where fluorescent atomic X-ray lines have linewidths wider than the detector resolution and require impractically high statistics to determine the gain and deconvolve the instrumental profile. Better-behaved sources such as grating monochromators are too cumbersome for space missions and difficult to use in the lab.
We consider the low-energy electronic properties of graphene cones in the presence of a global Fries-Kekulé Peierls distortion. Such cones occur in fullerenes as the geometric response to the disclination associated with pentagon rings. It is well known that the long-range effect of the deficit angle can be modelled in the continuum Dirac-equation approximation by a spin connection and a non-abelian gauge field. We show here that to understand the bound states localized in the vicinity of a pair of pentagons one must, in addition to the topological effects of curvature and flux,...
The phenomenon of avalanche-gain variations over time, particularly in Micro Pattern Gaseous Detectors (MPGD) incorporating insulator materials, has been generally attributed to electric-field modifications resulting from "charging-up" effects in the insulator. A robust methodology for the characterization of gain transients in such detectors is presented. It comprises three guidelines: detector initialization, long-term gain-stabilization monitoring, and imposing transients by applying abrupt changes...
We report on recent advances in the operation of bubble-assisted Liquid Hole Multipliers (LHM). By confining a vapor bubble under or adjacent to a perforated electrode immersed in liquid xenon, we could record both radiation-induced ionization electrons and primary scintillation photons in the noble liquid. Four types of LHM electrodes were investigated: a THGEM, a standard double-conical GEM, a 50 $\mu$m-thick single-conical GEM (SC-GEM) and a 125 $\mu$m-thick SC-GEM, all coated with CsI photocathodes. The latter provided the highest...
Neural networks are ubiquitous in high energy physics research. However, these highly nonlinear parameterized functions are treated as \textit{black boxes}, whose inner workings in conveying information and building the desired input-output relationship are often intractable. Explainable AI (xAI) methods can be useful in determining a neural model's relationship with data toward making it \textit{interpretable}, by establishing a quantitative and tractable relationship between input and output. In this letter of interest, we explore the potential...
The search for Top and Bottom Partners is a major focus of analyses at both the ATLAS and CMS experiments. When singly produced, these vector-like partners of the Standard Model third generation quarks retain a sizeable cross-section that makes them attractive candidates in their respective topologies. While most efforts have concentrated on the resonant mode of single production of these hypothetical particles, dominant for narrow widths, recent studies have revealed a wide and rich phenomenology involving non-resonant diagrams. In...
We investigate quantum correlations between successive steps of black hole evaporation and whether they might resolve the information paradox. 'Small' corrections in various models were shown to be unable to restore unitarity. We study a toy qubit model that allows small corrections and reaffirm previous results. Then, we relax the 'smallness' condition and find nontrivial upper and lower bounds on the entanglement entropy change during the process. This gives a quantitative measure of the size of the correction needed for these bounds to lead to significant...
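The basic bookkeeping quantity in such toy qubit models is the entanglement entropy of a subsystem, S = -Tr(rho ln rho) of its reduced density matrix. The generic two-qubit example below illustrates the computation (a product state gives S = 0, a Bell pair gives S = ln 2); it is not the specific model studied in the paper.

```python
import numpy as np

def entanglement_entropy(psi):
    # psi: normalized two-qubit state as a length-4 vector
    # in the basis (|00>, |01>, |10>, |11>).
    m = psi.reshape(2, 2)          # split amplitudes into subsystems A x B
    rho_a = m @ m.conj().T         # reduced density matrix of subsystem A
    evals = np.linalg.eigvalsh(rho_a)
    evals = evals[evals > 1e-12]   # drop zero modes: 0*ln(0) -> 0
    return float(-np.sum(evals * np.log(evals)))

product = np.array([1, 0, 0, 0], dtype=complex)            # |00>
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)  # (|00>+|11>)/sqrt(2)
print(entanglement_entropy(product))  # 0.0
print(entanglement_entropy(bell))     # ln 2 ~ 0.693
```

Tracking how this entropy changes from one emission step to the next is what the upper and lower bounds in the abstract constrain.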
Multivariate techniques and machine learning models have found numerous applications in High Energy Physics (HEP) research over many years. In recent times, AI models based on deep neural networks are becoming increasingly popular for many of these applications. However, they are regarded as black boxes: because of their high degree of complexity, it is often quite difficult to quantitatively explain the output of a neural network by establishing a tractable input-output relationship and tracing information propagation through its layers. As...
The findable, accessible, interoperable, and reusable (FAIR) data principles serve as a framework for examining, evaluating, and improving data sharing to advance scientific endeavors. There is an emerging trend to adapt these principles for machine learning models (algorithms that learn from data without specific coding) and, more generally, AI models, due to AI's swiftly growing impact on engineering sectors. In this paper, we propose a practical definition of the FAIR principles for AI models and provide a template program for their adoption. We...
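In practice, applying FAIR to an AI model starts with a machine-readable metadata record. The sketch below is purely hypothetical: the field names, identifiers, and the completeness check are illustrative stand-ins, not a published schema or the paper's template program.

```python
import json

# Hypothetical minimal FAIR-style metadata record for an AI model.
# All identifiers and field names here are illustrative placeholders.
model_card = {
    "identifier": "doi:10.5281/zenodo.0000000",  # F: globally unique, persistent
    "name": "example-jet-tagger",
    "description": "Illustrative classifier metadata record.",
    "access": {                                  # A: retrievable by open protocol
        "protocol": "https",
        "landing_page": "https://example.org/models/example-jet-tagger",
    },
    "interoperability": {                        # I: standard, open formats
        "serialization": "ONNX",
        "input_schema": {"features": 16, "dtype": "float32"},
    },
    "reuse": {                                   # R: license plus provenance
        "license": "CC-BY-4.0",
        "training_data": "doi:10.5281/zenodo.0000001",
        "software_environment": "python>=3.10, onnxruntime",
    },
}

def is_fair_enough(record):
    # Toy completeness check: each FAIR-relevant section must be present.
    required = ["identifier", "access", "interoperability", "reuse"]
    return all(record.get(k) for k in required)

print(json.dumps(model_card, indent=2))
print(is_fair_enough(model_card))
```

A record like this can be validated automatically on upload, which is what turns the FAIR principles from guidance into an enforceable checklist.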
Research in the data-intensive discipline of high energy physics (HEP) often relies on domain-specific digital contents. Reproducibility of research relies on proper preservation of these digital objects. This paper reflects on the interpretation of the principles of Findability, Accessibility, Interoperability, and Reusability (FAIR) in such a context and demonstrates its implementation by describing the development of an end-to-end support infrastructure for preserving and accessing Universal FeynRules Output (UFO) models guided by the FAIR principles. UFO...
Parikh and Wilczek formulated Hawking radiation as quantum tunneling across the event horizon, proving the spectrum to be nonthermal. These nonthermality factors, emerging due to back-reaction effects, have been claimed to be responsible for correlations among the emitted quanta. It has been proposed by several authors in the literature that these correlations actually carry out the information locked in a black hole and hence provide a resolution to the long debated paradox. This paper demonstrates that this is a fallacious proposition. Finally, it...
Hawking's argument about the non-unitary evolution of black holes is often questioned on the ground that it does not acknowledge quantum correlations in the radiation process. However, it has recently been shown that adding a `small' correction to the leading order Hawking analysis, accounting for such correlations, cannot help restore unitarity. This paper generalizes the bound on entanglement entropy by relaxing the `smallness' condition and configures the parameters for a possible recovery of information from an evaporating black hole. The new...
In recent years, digital object management practices to support findability, accessibility, interoperability, and reusability (FAIR) have begun to be adopted across a number of data-intensive scientific disciplines. These objects include datasets, AI models, software, notebooks, workflows, documentation, etc. With the collective dataset at the Large Hadron Collider scheduled to reach the zettabyte scale by the end of 2032, the experimental particle physics community is looking at unprecedented data challenges. It...