- Theoretical and Experimental Particle Physics Studies
- High-Energy Particle Collisions Research
- Particle Detector Development and Performance
- Quantum Chromodynamics and Particle Interactions
- Dark Matter and Cosmic Phenomena
- Cosmology and Gravitation Theories
- Computational Physics and Python Applications
- Neutrino Physics Research
- Distributed and Parallel Computing Systems
- Black Holes and Theoretical Physics
- Radiation Detection and Scintillator Technologies
- Advanced Data Storage Technologies
- Astrophysics and Cosmic Phenomena
- Parallel Computing and Optimization Techniques
- Medical Imaging Techniques and Applications
- Scientific Computing and Data Management
- Advanced Mathematical Theories
- Muon and Positron Interactions and Applications
- Radiation Therapy and Dosimetry
- Radiation Effects in Electronics
- Big Data Technologies and Applications
- Algorithms and Data Compression
- Particle Accelerators and Beam Dynamics
- Radiomics and Machine Learning in Medical Imaging
- Nuclear Reactor Physics and Engineering
Université Paris-Sud
2009-2022
Centre National de la Recherche Scientifique
2010-2022
Institut National de Physique Nucléaire et de Physique des Particules
2010-2022
Laboratoire de Physique Corpusculaire
2006-2020
Lawrence Berkeley National Laboratory
2008-2020
Université Clermont Auvergne
2019
Université Paris-Saclay
2013-2017
The University of Adelaide
2013-2016
Université Paris Cité
2016
Ludwig-Maximilians-Universität München
2012-2016
We report on measurements of the neutron spin asymmetries A1^n and A2^n and the polarized structure functions g1^n and g2^n at three kinematics in the deep inelastic region, with x = 0.33, 0.47, and 0.60 and Q^2 = 2.7, 3.5, and 4.8 (GeV/c)^2, respectively. These were performed using a 5.7 GeV longitudinally polarized electron beam and a polarized 3He target. The results for A1^n and g1^n at x = 0.33 are consistent with previous world data and, at the two higher-x points, improve the precision by about an order of magnitude. The new data show a zero crossing of A1^n around x = 0.47, and the value at x = 0.60 is...
We have measured the neutron spin asymmetry A1^n with high precision at three kinematics in the deep inelastic region, at x = 0.33, 0.47, and 0.60, with Q^2 = 2.7, 3.5, and 4.8 (GeV/c)^2, respectively. Our results unambiguously show, for the first time, that A1^n crosses zero around x = 0.47 and becomes significantly positive at x = 0.60. Combined with the world proton data, polarized quark distributions were extracted. The results, in general, agree with relativistic constituent quark models and with perturbative quantum chromodynamics (PQCD) analyses based on...
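For context, the asymmetry measured here has the standard definition in terms of helicity-dependent photoabsorption cross sections (a textbook relation, not specific to this analysis):

```latex
A_1 = \frac{\sigma_{1/2} - \sigma_{3/2}}{\sigma_{1/2} + \sigma_{3/2}}
      \;\approx\; \frac{g_1(x, Q^2)}{F_1(x, Q^2)}
```

where \sigma_{1/2} (\sigma_{3/2}) is the cross section for total photon-nucleon spin projection 1/2 (3/2) along the beam axis, and the approximation holds in the deep inelastic limit, where the g_2 contribution is kinematically suppressed.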
The SoLid experiment, short for Search for Oscillations with a Lithium-6 detector, is a new-generation neutrino experiment which aims to address the key challenges of high-precision...
The ATLAS detector has been designed for operation at the Large Hadron Collider at CERN. ATLAS includes electromagnetic and hadronic liquid argon calorimeters, with almost 200,000 channels of data that must be sampled at the LHC bunch-crossing frequency of 40 MHz. The calorimeter electronics calibration and readout are performed by custom electronics developed specifically for these purposes. This paper describes the system performance of the electronics, including noise, energy and time resolution, and long-term stability, with data taken mainly from...
We present the first measurement of the Q^2 dependence of the neutron spin structure function g2^n at five kinematic points covering 0.57 (GeV/c)^2 <= Q^2 <= 1.34 (GeV/c)^2 at x ≈ 0.2. Though the naive quark-parton model predicts g2 = 0, nonzero values occur in more realistic models of the nucleon which include quark-gluon correlations, finite quark masses, or orbital angular momentum. When scattering from a noninteracting quark, g2 can be predicted using next-to-leading-order fits to world data for g1^n. Deviations from this...
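The "noninteracting quark" baseline mentioned above is the Wandzura-Wilczek relation, which fixes the twist-2 part of g_2 entirely in terms of g_1:

```latex
g_2^{\mathrm{WW}}(x, Q^2) = -\,g_1(x, Q^2) + \int_x^1 \frac{g_1(y, Q^2)}{y}\, \mathrm{d}y
```

Measured deviations of g2^n from this prediction therefore probe higher-twist effects such as quark-gluon correlations.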
Ten years after its first version, the Gaudi software framework has undergone many changes and improvements, with a consequent increase of the code base. Those changes were almost always introduced preserving backward compatibility and reducing as much as possible changes to the framework itself; obsolete code has been removed only rarely. After the release targeted to the data taking of 2008, it was decided to review the framework with the aim of a general consolidation in view of the data taking of 2009. We also took the occasion to introduce those changes never implemented before because of the big impact they have on the rest of the code, needed...
In 2004, at the ATLAS (A Toroidal LHC ApparatuS) combined test beam, one slice of the barrel detector (including an Inner Detector set-up and the Liquid Argon calorimeter) was exposed to particles from the H8 SPS beam line at CERN. It was the first occasion to study electron performance in ATLAS. This paper presents the results obtained for the momentum measurement p and for the LAr calorimeter energy measurement E (linearity and resolution) in the presence of a magnetic field, for momenta ranging from 20 GeV/c to 100 GeV/c. Furthermore, the particle identification capabilities...
The SoLid experiment aims to measure neutrino oscillations at a baseline of 6.4 m from the BR2 nuclear reactor in Belgium. Anti-neutrinos interact via inverse beta decay (IBD), resulting in a positron and a neutron signal that are correlated in time and space. The detector operates in a surface building with modest shielding, and relies on extremely efficient online rejection of backgrounds in order to identify these interactions. A novel detector design has been developed using 12800 5 cm cubes for high segmentation. Each cube is...
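The time-and-space pairing of the prompt positron and delayed neutron signals can be illustrated with a toy coincidence selection (the event format, cube coordinates, and cut values below are invented for illustration, not the experiment's actual criteria):

```python
def ibd_candidates(positrons, neutrons, max_dt=200.0, max_cubes=2):
    """Pair prompt positron and delayed neutron signals that are close in
    time and space. Events are (time_us, x, y, z) tuples with integer cube
    indices; the cut values are illustrative, not SoLid's actual selection."""
    pairs = []
    for tp, xp, yp, zp in positrons:
        for tn, xn, yn, zn in neutrons:
            dt = tn - tp                                  # neutron must follow positron
            cubes = abs(xn - xp) + abs(yn - yp) + abs(zn - zp)
            if 0 < dt <= max_dt and cubes <= max_cubes:
                pairs.append((tp, tn))
    return pairs

# One IBD-like pair (close in time and space) plus uncorrelated background hits.
positrons = [(10.0, 1, 1, 1), (500.0, 7, 7, 7)]
neutrons = [(75.0, 1, 2, 1), (5000.0, 0, 0, 0)]
print(ibd_candidates(positrons, neutrons))  # -> [(10.0, 75.0)]
```

In the real experiment the delayed-coincidence window follows the neutron capture time on 6Li and the selection must run online at high rate; the sketch shows only the pairing logic.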
The SoLid experiment is a very-short-baseline experiment aimed at searching for nuclear-reactor-produced active-to-sterile antineutrino oscillations. Its detection principle is based on the pairing of two types of solid scintillators: polyvinyl toluene and 6LiF:ZnS(Ag), a new technology in this field of physics. In addition to good neutron-gamma discrimination, this setup allows the detector to be highly segmented (the basic unit is a cube of 5 cm side). High segmentation provides numerous advantages, including precise...
Computers are no longer getting faster: instead, they are growing more and more CPUs, each of which is no faster than the previous generation. This increase in the number of cores evidently calls for parallelism in HENP software. If end-users' stand-alone analysis applications are relatively easy to modify, the LHC experiments' frameworks, being mostly written with a single 'thread' of execution in mind and having consequently large code bases, are on the other hand challenging to parallelize. Widespread inconsiderate changes so close to data taking are out...
The shared memory architecture of multicore CPUs provides HEP developers with the opportunity to reduce the memory footprint of their applications by sharing pages between the cores in a processor. ATLAS pioneered the multi-process approach to parallelize applications. Using the Linux fork() and Copy-On-Write mechanisms, we implemented a simple event task farm, which allowed us to achieve almost 80% memory sharing among worker processes for certain types of reconstruction jobs, with negligible CPU overhead. By leaving the memory management to the operating system, we have...
The construction and first proton beam tests of a demonstrator dedicated to the ballistic control of hadrontherapy cancer treatments are described. This cost-effective demonstrator, called the large area pixelized detector, is a PET-like detector used for in-beam control. It was built to test the feasibility of monitoring, in real time during irradiation, the ion range in the patient through the measurement of beam-induced β+...
The Tile Calorimeter, which constitutes the central section of the ATLAS hadronic calorimeter, is a non-compensating sampling device made of iron and scintillating tiles. The construction phase of the calorimeter is nearly complete, and most effort is now directed toward final assembly and commissioning in the underground experimental hall. The layout and the tasks carried out during commissioning are described, prefaced by a brief reminder of the requirements that drove the design. During the last few years, a comprehensive test-beam program has been followed in order to...
The SoLid experiment has been designed to search for an oscillation pattern induced by a light sterile neutrino state, utilising the BR2 reactor of SCK⋅CEN, in Belgium. The detector leverages a new hybrid technology, combining two distinct scintillators in a cubic array to create a highly segmented detection volume. The combination of 5 cm polyvinyltoluene cells, with 6LiF:ZnS(Ag) sheets on the faces of each cube, facilitates the reconstruction of the signals. Whilst the high granularity provides a powerful toolset to discriminate backgrounds, the detector itself...
A critical component of any multicore/manycore application architecture is the handling of input and output. Even in the simplest models, design decisions interact in both obvious and subtle ways with persistence strategies. When multiple workers handle I/O independently using distinct instances of a serial framework, for example, it may happen that, because of the way data from consecutive events are compressed together, there are serious inefficiencies, with workers redundantly reading the same buffers. With shared...
In modern High Energy Physics (HEP) experiments, visualization of experimental data has a key role in many activities and tasks across the whole data chain: from detector development to monitoring, from event generation to reconstruction of physics objects, from simulation to analysis, all the way to outreach and education. In this paper, the definition, status, and evolution of data visualization for HEP will be presented. Suggestions for the upgrade of tools and techniques currently in use will be outlined, along with guidelines for future experiments. This paper expands on the summary content...
The reconstruction of photons in the ATLAS detector is studied with data taken during the 2004 Combined Test Beam, where a full slice of the ATLAS barrel detector was exposed to beams of particles of known energy at the CERN SPS. The results presented show significant differences in the longitudinal development of the electromagnetic shower between converted and unconverted photons, as well as in the total measured energy. The potential use of reconstructed photon conversions as a means to precisely map the material of the tracker in front of the calorimeter is also considered. All the results obtained are compared with a detailed Monte-Carlo...
Thermal limitations have forced CPU manufacturers to shift from simply increasing clock speeds to improve processor performance, toward producing chip designs with multi- and many-core architectures. Further, the cores themselves can run multiple threads, with a near-zero-overhead context switch allowing low-level resource sharing (Intel Hyperthreading). To maximize bandwidth and minimize memory latency, memory access has become non-uniform (NUMA). As manufacturers add more cores to each chip, a careful understanding of the underlying architecture...
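On Linux, the placement of a process onto specific cores can be inspected and controlled from user code. A minimal, Linux-only sketch using the standard library (core numbering and its mapping to NUMA nodes are machine-dependent, so the choice of core here is purely illustrative):

```python
import os

# Linux-only: query the set of cores this process is allowed to run on.
available = os.sched_getaffinity(0)
print(f"process may run on {len(available)} cores")

# Pin the process to a single core. On a NUMA machine, keeping a worker on
# cores local to its memory node avoids remote-memory access latency; which
# cores belong to which node is exposed under /sys/devices/system/node.
first = min(available)
os.sched_setaffinity(0, {first})
assert os.sched_getaffinity(0) == {first}

# Restore the original affinity mask.
os.sched_setaffinity(0, available)
```

Frameworks typically apply such pinning per worker process, pairing each worker with memory allocated on its local node.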
Current High Energy and Nuclear Physics (HENP) frameworks were written before multicore systems became widely deployed. A 'single-thread' execution model naturally emerged from that environment; however, this model no longer fits the processing landscape at the dawn of the manycore era. Although previous work focused on minimizing the changes to be applied to the LHC frameworks (because of the ongoing data-taking phase) while still trying to reap the benefits of parallel-enhanced CPU architectures, this paper explores what new languages could bring to the design...
In anticipation of data taking, ATLAS has undertaken a program of work to develop an explicit state representation of the experiment's complex transient event model. This effort has provided both an opportunity to consider explicitly the structure, organization, and content of the persistent store before writing tens of petabytes of data (replacing simple streaming, which uses the persistent store as a core dump of transient memory), and a locus of support for model evolution, including significant refactoring, beyond the automatic schema evolution capabilities of the underlying...