C. Fitzpatrick
- Theoretical and Experimental Particle Physics Studies
- Quantum Chromodynamics and Particle Interactions
- High-Energy Particle Collisions Research
- Neutrino Physics Research
- Particle Detector Development and Performance
- Dark Matter and Cosmic Phenomena
- Black Holes and Theoretical Physics
- Computational Physics and Python Applications
- Medical Imaging Techniques and Applications
- Particle Accelerators and Free-Electron Lasers
- Superconducting Materials and Applications
- Atomic and Subatomic Physics Research
- Distributed and Parallel Computing Systems
- Nuclear Physics Research Studies
- Stochastic Processes and Statistical Mechanics
- Cosmology and Gravitation Theories
- Radiation Detection and Scintillator Technologies
- International Science and Diplomacy
- Markov Chains and Monte Carlo Methods
- Advanced NMR Techniques and Applications
- Astrophysics and Cosmic Phenomena
- Cold Atom Physics and Bose-Einstein Condensates
- Pulsars and Gravitational Waves Research
- Advanced Mathematical Theories
- Stochastic Processes and Financial Applications
University of Manchester
2020-2025
Texas A&M University
2025
University of Oxford
2023
European Organization for Nuclear Research
2010-2021
AGH University of Krakow
2014-2021
École Polytechnique Fédérale de Lausanne
2014-2020
University of Edinburgh
2010-2020
University of Chinese Academy of Sciences
2020
Peking University
2020
State Key Laboratory of Nuclear Physics and Technology
2020
The LHCb experiment has been taking data at the Large Hadron Collider (LHC) at CERN since the end of 2009. One of its key detector components is the Ring-Imaging Cherenkov (RICH) system. This provides charged particle identification over a wide momentum range, from 2–100 GeV/c. The operation and control, software, and online monitoring of the RICH system are described. The particle identification performance is presented, as measured using data from the LHC. Excellent separation of hadronic particle types (π, K, p) is achieved.
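As an illustration of the principle behind RICH particle identification, the sketch below evaluates the Cherenkov-angle relation cos θc = 1/(nβ) for π, K and p at a few momenta. The refractive index is an assumed, illustrative value; this is not part of the LHCb reconstruction.

```python
import math

# Illustrative Cherenkov-angle calculation: cos(theta_c) = 1 / (n * beta).
# The refractive index below is a hypothetical gas-radiator value, not an
# official LHCb RICH parameter.
MASSES_GEV = {"pi": 0.13957, "K": 0.49368, "p": 0.93827}
N_GAS = 1.0014  # assumed refractive index of the gas radiator

def cherenkov_angle_mrad(p_gev, mass_gev, n=N_GAS):
    """Return the Cherenkov angle in mrad, or None if below threshold."""
    beta = p_gev / math.sqrt(p_gev**2 + mass_gev**2)
    cos_theta = 1.0 / (n * beta)
    if cos_theta > 1.0:
        return None  # particle is below the Cherenkov threshold
    return 1e3 * math.acos(cos_theta)

for p in (10.0, 50.0, 100.0):  # momenta in GeV/c
    angles = {h: cherenkov_angle_mrad(p, m) for h, m in MASSES_GEV.items()}
    print(f"p = {p:.0f} GeV/c:", angles)
```

The angle differences between the hypotheses shrink with increasing momentum, which is why separation becomes harder at the high end of the quoted range.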
Machine learning is an important applied research area in particle physics, beginning with applications to high-level physics analysis in the 1990s and 2000s, followed by an explosion of applications in event identification and reconstruction in the 2010s. In this document we discuss promising future research and development areas for machine learning in particle physics, a roadmap for their implementation, software and hardware resource requirements, collaborative initiatives with the data science community, academia and industry, and training the particle physics community in data science. The main objective is to connect...
Motivated by the success of the flavour physics programme carried out over the last decade at the Large Hadron Collider (LHC), we characterize in detail the potential of its High-Luminosity and High-Energy upgrades in this domain of physics. We document the extraordinary breadth of the HL/HE-LHC programme enabled by a putative Upgrade II of the dedicated flavour physics experiment LHCb and the evolution of the established roles of the ATLAS and CMS general purpose experiments. We connect these to studies of the top quark, Higgs boson, and direct high-$p_T$ searches for new particles and force carriers. We discuss...
Particle physics has an ambitious and broad experimental programme for the coming decades. This programme requires large investments in detector hardware, either to build new facilities and experiments, or to upgrade existing ones. Similarly, it requires commensurate investment in the R&D of software to acquire, manage, process, and analyse the sheer amounts of data to be recorded. In planning for the HL-LHC in particular, it is critical that all of the collaborating stakeholders agree on the goals and priorities, and that the efforts complement each other. In this spirit, this white paper...
The LHCb collaboration has redesigned its trigger to enable the full offline detector reconstruction to be performed in real time. Together with the real-time alignment and calibration of the detector, and a software infrastructure to make persistent the high-level physics objects produced during real-time processing, this redesign enabled the widespread deployment of real-time analysis during Run 2. We describe the design of the Run 2 trigger and real-time reconstruction, and present data-driven performance measurements for a representative sample of LHCb's physics programme.
We hypothesized that patients who underwent cesarean delivery and received oxytocin boluses followed by an infusion would have a lower incidence of secondary uterotonic administration compared to those who had an infusion without boluses. Patients with deliveries at our hospital from September 1, 2021 through December 31, 2022, corresponding to the no-bolus and bolus cohorts, respectively, were included. Patient demographic, physical, and clinical characteristic data were collected by a study investigator. Intramyometrial oxytocin,...
Background: Replication stress is a feature of the hyper-proliferation experienced by many cancers, leading to genome instability and the accumulation of potentially lethal DNA damage. During their evolution, cancer cells inactivate specific DNA-damage response pathways, placing a high dependence on the pathways that remain, one of which is regulated by PKMYT1. This protein kinase prevents unscheduled G2 to M transition, allowing the time needed for repair before division. In cancers that rely on PKMYT1, its inhibition forces cells to divide when...
Machine learning has been applied to several problems in particle physics research, beginning with applications to high-level physics analysis in the 1990s and 2000s, followed by an explosion of applications in event identification and reconstruction in the 2010s. In this document we discuss promising future research and development areas for machine learning in particle physics. We detail a roadmap for their implementation, software and hardware resource requirements, collaborative initiatives with the data science community, academia and industry, and training the particle physics community in data science....
An evolved real-time data processing strategy is proposed for high-energy physics experiments, and its implementation at the LHCb experiment is presented. The reduced event model allows not only the signal candidate firing the trigger to be persisted, as previously available, but also an arbitrary set of other reconstructed or raw objects from the event. This allows higher event rates for a given output bandwidth, when compared to the traditional approach of saving the full detector readout for each trigger, whilst accommodating inclusive triggers...
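A minimal sketch of the reduced-event-model idea described above: persist only the trigger candidate plus a requested subset of other objects rather than the full detector readout. All class names and sizes below are hypothetical and serve only to illustrate the bandwidth saving; they are not the LHCb event model.

```python
# Hypothetical sketch of selective persistence; names and sizes are
# illustrative, not the actual LHCb event model.
from dataclasses import dataclass

CANDIDATE_KB = 5  # assumed size of one persisted candidate

@dataclass
class Event:
    raw_banks: dict       # subdetector name -> payload size in kB
    candidates: list      # reconstructed signal candidates
    other_objects: dict   # extra reconstructed objects -> size in kB

def persisted_size_kb(event, keep_raw=(), keep_objects=()):
    """Size written out when only the requested pieces are persisted."""
    size = CANDIDATE_KB * len(event.candidates)
    size += sum(event.raw_banks[b] for b in keep_raw)
    size += sum(event.other_objects[o] for o in keep_objects)
    return size

evt = Event(raw_banks={"VELO": 30, "RICH": 40, "CALO": 20},
            candidates=["B -> K pi"],
            other_objects={"primary_vertices": 3, "extra_tracks": 25})

full_kb = sum(evt.raw_banks.values()) + sum(evt.other_objects.values()) + CANDIDATE_KB
reduced_kb = persisted_size_kb(evt, keep_objects=("primary_vertices",))
print(f"full event: {full_kb} kB, reduced event: {reduced_kb} kB")
# For a fixed output bandwidth, the smaller persisted event allows a
# correspondingly higher trigger output rate.
```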
The LHCb experiment at CERN is undergoing an upgrade in preparation for the Run 3 data taking period of the LHC. As part of this upgrade, the trigger is moving to a fully software implementation operating at the LHC bunch crossing rate. We present an evaluation of a CPU-based and a GPU-based implementation of the first stage of the High Level Trigger. After a detailed comparison, both options are found to be viable. This document summarizes the performance and implementation details of these options, the outcome of which has led to the choice of the GPU-based implementation as the baseline.
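For intuition only, the toy calculation below shows the kind of sizing exercise such a comparison involves: how many processing units each option would need to absorb a given input rate. The throughput figures are invented for illustration and are not the measured CPU or GPU numbers from that evaluation.

```python
import math

# Toy sizing exercise: units needed = input rate / per-unit throughput.
# All numbers are hypothetical, not measured LHCb HLT1 throughputs.
INPUT_RATE_MHZ = 30.0
throughput_mhz = {"CPU dual-socket node": 0.17, "GPU card": 0.06}

for option, thr in throughput_mhz.items():
    print(f"{option}: {math.ceil(INPUT_RATE_MHZ / thr)} units needed")
```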
The LHCb experiment will operate at a luminosity of 2 × 10³³ cm⁻² s⁻¹ during LHC Run 3. At this rate the present readout and hardware Level-0 trigger become a limitation, especially for fully hadronic final states. In order to maintain a high signal efficiency, the upgraded LHCb detector will deploy two novel concepts: a triggerless readout and a full software trigger.
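A back-of-the-envelope check of what the quoted luminosity implies for interaction rates: only the luminosity comes from the abstract, while the cross section and crossing rate below are assumed, illustrative inputs.

```python
# Rate = luminosity x cross section. Only the luminosity is taken from the
# abstract; the cross section and crossing rate are assumed values.
LUMINOSITY = 2e33          # cm^-2 s^-1 (from the abstract)
SIGMA_INELASTIC = 8e-26    # cm^2 (~80 mb, assumed visible pp cross section)
CROSSING_RATE_HZ = 30e6    # assumed rate of non-empty bunch crossings

interaction_rate_hz = LUMINOSITY * SIGMA_INELASTIC
pileup = interaction_rate_hz / CROSSING_RATE_HZ
print(f"pp interaction rate ~ {interaction_rate_hz/1e6:.0f} MHz, "
      f"mean interactions per crossing ~ {pileup:.1f}")
```

Under these assumptions the rate of interactions far exceeds what a hardware trigger can usefully reduce for hadronic signatures, which motivates the triggerless readout and full software trigger.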
CF₄ is used as a Cherenkov gas radiator in one of the Ring Imaging Cherenkov detectors at the LHCb experiment at the CERN Large Hadron Collider. CF₄ is well known to have a high scintillation photon yield in the near and far VUV, UV and visible wavelength range. A large flux of photons within our detection acceptance, between 200 and 800 nm, could compromise the particle identification efficiency. We will show that this emission in our system can be effectively quenched, consistent with radiationless transitions, with no significant impact on the photon yield resulting from Cherenkov radiation.
Common and community software packages, such as ROOT, Geant4 and event generators, have been a key part of the LHC's success so far, and their continued development and optimisation will be critical in the future. The challenges are driven by an ambitious physics programme, notably the LHC accelerator upgrade to high luminosity, HL-LHC, and the corresponding detector upgrades of ATLAS and CMS. In this document we address the issues for software that is used in multiple experiments (usually even more widely than ATLAS and CMS) and maintained by teams of developers who...
The LHCb Performance Regression (LHCbPR) framework allows for periodic software testing to be performed in a reproducible manner. LHCbPR provides a JavaScript-based web front-end and service, built atop industry-standard tools such as AngularJS, Bootstrap and Django. This records the evolution of tests over time, allowing this data to be extracted for end-user analysis. The framework has been expanded to integrate nightly build profiling. These developments allow key performance metrics within the Trigger to be monitored over time. Additionally,...
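As a rough illustration of how such time-series test results could be modelled in a Django back-end, the sketch below defines two tables, one per test run and one per recorded metric. The class and field names are hypothetical and do not reflect the actual LHCbPR schema.

```python
# Hypothetical Django models for storing periodic test results over time;
# names are illustrative only, not the LHCbPR schema.
from django.db import models

class TestRun(models.Model):
    application = models.CharField(max_length=100)  # e.g. a trigger application
    version = models.CharField(max_length=50)
    executed_at = models.DateTimeField()

class MetricResult(models.Model):
    run = models.ForeignKey(TestRun, on_delete=models.CASCADE, related_name="metrics")
    name = models.CharField(max_length=100)          # e.g. "throughput_evts_per_s"
    value = models.FloatField()
```

Querying `MetricResult` entries by `name` across `TestRun.executed_at` then gives the evolution of a metric over time, which is the kind of view the web front-end exposes.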
The upgraded LHCb detector, due to start data taking in 2022, will have to process an average data rate of 4 TB/s in real time. Because LHCb's physics objectives require that the full detector information for every LHC bunch crossing is read out and made available for real-time processing, this bandwidth challenge is equivalent to that of the ATLAS and CMS HL-LHC software read-out, but deliverable five years earlier. Over the past six years, the collaboration has undertaken a bottom-up rewrite of its software infrastructure, pattern...
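A quick sanity check of the scale implied by the quoted figure: dividing the 4 TB/s average data rate by an assumed rate of non-empty bunch crossings gives the average event size the real-time system must handle.

```python
# Only the 4 TB/s figure comes from the abstract; the crossing rate is an
# assumed round number used for illustration.
DATA_RATE_B_PER_S = 4e12   # 4 TB/s
CROSSING_RATE_HZ = 30e6    # assumed non-empty bunch-crossing rate

event_size_kb = DATA_RATE_B_PER_S / CROSSING_RATE_HZ / 1e3
print(f"average event size ~ {event_size_kb:.0f} kB")  # roughly 130 kB per event
```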
Realizing the physics programs of the planned and upgraded high-energy physics (HEP) experiments over the next 10 years will require the HEP community to address a number of challenges in the area of software and computing. For this reason, the community has engaged in a planning process over the past two years, with the objective of identifying and prioritizing the research and development required to enable the next generation of detectors to fulfill their full physics potential. The aim is to produce a Community White Paper which will describe the strategy and roadmap for software and computing in HEP for the 2020s. The topics of event...
Since his death in 1966, Delmore Schwartz has come to be seen as a figure of failure, the first and most readily destroyed of the second American ‘lost generation’. This article argues that, on the contrary, important sections of Schwartz's work succeed in creating a ‘poetry of failure’ which mimes the collapse of the attempt to conjure beauty into existence with words. That these honest and attentive poetic accounts of unremitting difficulty have been overlooked by critics preoccupied with ‘strength’ or ‘acclaim’ only serves to call...