Roberto Divià

ORCID: 0000-0002-6357-7857
Research Areas
  • High-Energy Particle Collisions Research
  • Particle physics theoretical and experimental studies
  • Quantum Chromodynamics and Particle Interactions
  • Particle Detector Development and Performance
  • Nuclear reactor physics and engineering
  • Advanced Data Storage Technologies
  • Distributed and Parallel Computing Systems
  • Dark Matter and Cosmic Phenomena
  • Superconducting Materials and Applications
  • Cosmology and Gravitation Theories
  • Pulsars and Gravitational Waves Research
  • Particle Accelerators and Free-Electron Lasers
  • Radiation Detection and Scintillator Technologies
  • Statistical Methods and Bayesian Inference
  • Computational Physics and Python Applications
  • Nuclear physics research studies
  • Stochastic processes and statistical mechanics
  • Scientific Computing and Data Management
  • Atomic and Molecular Physics
  • Solar and Space Plasma Dynamics
  • Radiation Effects in Electronics
  • Black Holes and Theoretical Physics
  • VLSI and Analog Circuit Testing
  • Muon and positron interactions and applications
  • Theoretical and Computational Physics

European Organization for Nuclear Research
2015-2025

A. Alikhanyan National Laboratory
2019-2024

Universidad Nacional Autónoma de México
2022-2024

Istituto Nazionale di Fisica Nucleare, Sezione di Torino
2019

Results are presented on hyperon and antihyperon production in Pb–Pb, p–Pb and p–Be collisions at 158 GeV/c per nucleon. Λ, Ξ and Ω yields have been measured at central rapidity and medium transverse momentum as functions of the centrality of the collision. Comparing the Pb–Pb yields to those from p–Be interactions, a strangeness enhancement is observed. The enhancement increases with the strangeness content of the hyperons, reaching a factor of about 20 for the most central collisions.

10.1088/0954-3899/32/4/003 article EN Journal of Physics G Nuclear and Particle Physics 2006-02-24
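
As an aside, a strangeness enhancement factor of this kind is conventionally a ratio of yields normalised to the number of wounded nucleons. The sketch below illustrates that calculation only; the yields, wounded-nucleon numbers and function names are hypothetical placeholders, not NA57 values.

```python
# Minimal sketch of a strangeness-enhancement calculation:
# enhancement = (yield per wounded nucleon in Pb-Pb) / (yield per wounded nucleon in p-Be).
# All numbers below are illustrative placeholders, not NA57 results.

def yield_per_wounded_nucleon(raw_yield: float, n_wounded: float) -> float:
    """Normalise a hyperon yield to the number of wounded (participant) nucleons."""
    return raw_yield / n_wounded

def enhancement(pbpb_yield: float, pbpb_nwound: float,
                pbe_yield: float, pbe_nwound: float) -> float:
    """Ratio of normalised yields: heavy-ion system over the p-Be reference."""
    return (yield_per_wounded_nucleon(pbpb_yield, pbpb_nwound)
            / yield_per_wounded_nucleon(pbe_yield, pbe_nwound))

if __name__ == "__main__":
    # Hypothetical Omega yields: 1.4 per event with 350 wounded nucleons in central Pb-Pb,
    # 4.0e-4 per event with 2.5 wounded nucleons in p-Be.
    print(f"Enhancement ~ {enhancement(1.4, 350, 4.0e-4, 2.5):.1f}")
```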

In this paper we describe the design, construction, commissioning and operation of the Data Acquisition (DAQ) and Experiment Control System (ECS) of the ALICE experiment at the CERN Large Hadron Collider (LHC). The DAQ and the ECS are the systems used respectively for the acquisition of all physics data and for the overall control of the experiment. They are two computing systems made of hundreds of PCs and data storage units interconnected via networks. The collection of experimental data from the detectors is performed by several high-speed optical links. We describe in detail the design...

10.1016/j.nima.2013.12.015 article EN cc-by Nuclear Instruments and Methods in Physics Research Section A Accelerators Spectrometers Detectors and Associated Equipment 2013-12-17

ALICE (A Large Ion Collider Experiment) has undertaken a major upgrade during the LHC Long Shutdown 2. The increase in detector data rates led to a hundredfold increase in input raw data, up to 3.5 TB/s. To cope with it, a new common Online and Offline computing system, called O2, has been developed and put in production. The O2/FLP (First Level Processor) system, successor of the DAQ, implements the critical functions of readout, quality control and operational services running in the CR1 computing centre at the experimental site. Data from 15 subdetectors are read...

10.1051/epjconf/202429502029 article EN cc-by EPJ Web of Conferences 2024-01-01
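
To make a figure like the quoted 3.5 TB/s input concrete, an aggregate readout rate is simply the sum of per-link rates over all read-out links. The sketch below shows that arithmetic only; the detector names, link counts and per-link rates are illustrative assumptions, not ALICE numbers.

```python
# Minimal sketch: aggregate per-detector readout rates into a total input throughput,
# as a sanity check against a target such as 3.5 TB/s.
# Link counts and per-link rates are illustrative placeholders, not ALICE figures.

detectors = {
    # name: (number of read-out links, average rate per link in GB/s)
    "TPC": (360, 8.0),
    "ITS": (192, 1.5),
    "MFT": (80, 0.5),
}

total_gb_s = sum(n_links * rate for n_links, rate in detectors.values())
print(f"Aggregate input: {total_gb_s / 1000:.2f} TB/s")
```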

ALICE (A Large Ion Collider Experiment) is the detector system at the LHC (Large Hadron Collider) that studies the behaviour of strongly interacting matter and the quark-gluon plasma. The information sent by the sub-detectors composing the experiment is read out by DATE (Data Acquisition Test Environment), the data acquisition software, using hundreds of multi-mode optical links called DDL (Detector Data Link). To cope with the higher luminosity of the LHC, the bandwidth of these links will be upgraded in 2015. This paper describes the evolution of the protocol from 2 to 6 Gbit/s.

10.1088/1748-0221/10/04/c04008 article EN cc-by Journal of Instrumentation 2015-04-10
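
The usable payload of a serial link is its line rate minus the encoding overhead. The sketch below illustrates that conversion under an assumed 8b/10b encoding; the encoding efficiency is an assumption made here for illustration, not a figure taken from the paper.

```python
# Minimal sketch: usable payload bandwidth of a serial link after line encoding.
# Assumes 8b/10b encoding (illustrative assumption, not taken from the DDL paper).

def payload_mb_per_s(line_rate_gbit: float, encoding_efficiency: float = 0.8) -> float:
    """Convert a raw line rate in Gbit/s to payload MB/s after encoding overhead."""
    return line_rate_gbit * 1e9 * encoding_efficiency / 8 / 1e6

for rate in (2.0, 6.0):  # the two DDL generations quoted in the abstract
    print(f"{rate} Gbit/s line rate -> ~{payload_mb_per_s(rate):.0f} MB/s payload")
```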

All major experiments need tools that provide a way to keep a record of the events and activities, both during commissioning and operations. In ALICE (A Large Ion Collider Experiment) at CERN, this task is performed by the ALICE Electronic Logbook (eLogbook), a custom-made application developed and maintained by the Data-Acquisition (DAQ) group. Started as a statistics repository, the eLogbook has evolved to become not only a fully functional electronic logbook, but also a massive information repository used to store the conditions...

10.1088/1742-6596/219/2/022027 article EN Journal of Physics Conference Series 2010-04-01

The Scalable Coherent Interface (SCI) is an IEEE proposed standard (P1596) for interconnecting multiprocessor systems. It defines point-to-point connections between nodes, which can be processors, memories, or I/O devices. Networks containing a maximum of 64K nodes with a bandwidth of 1 Gbyte/s can be constructed. SCI is an attractive candidate to serve as the backbone of high-speed, large-volume data acquisition systems such as those required by future experiments at the Large Hadron Collider (LHC) at CERN. First results...

10.1109/23.277466 article EN IEEE Transactions on Nuclear Science 1992-04-01

ALICE (A Large Ion Collider Experiment) is the heavy-ion detector designed to study the physics of strongly interacting matter and the quark-gluon plasma at the CERN Large Hadron Collider (LHC). Some specific calibration tasks are performed regularly for each of the 18 sub-detectors in order to achieve the most accurate measurements. These procedures involve events analysis in a wide range of experimental conditions, implicating various trigger types, data throughputs, electronics settings and algorithms, both during short sub-detector...

10.1088/1742-6596/219/2/022004 article EN Journal of Physics Conference Series 2010-04-01

The ALICE (A Large Ion Collider Experiment) Data Acquisition (DAQ) system has the unprecedented requirement to ensure a very high volume, sustained data stream between the detector and the Permanent Data Storage (PDS), which is used as the main repository for the event processing performed by Offline Computing. A key component to accomplish this task is the Transient Data Storage (TDS), a set of storage elements with its associated hardware and software components, which supports the raw data collection and its conversion into a format suitable for the subsequent high-level...

10.1088/1742-6596/219/5/052002 article EN Journal of Physics Conference Series 2010-04-01

ALICE (A Large Ion Collider Experiment) is the detector system at the LHC (Large Hadron Collider) optimized for the study of heavy-ion collisions at interaction rates up to 50 kHz and data rates beyond 1 TB/s. Its main aim is to study the behavior of strongly interacting matter and the quark-gluon plasma. ALICE is preparing a major upgrade: starting from 2021, it will collect data with several upgraded sub-detectors (TPC, ITS, Muon Tracker and Chamber, TRD and TOF). The DAQ read-out will be upgraded as well, with a new link called GBT (GigaBit Transceiver) with a maximum speed of 4.48 Gb/s...

10.1109/rtc.2016.7543109 article EN 2016-06-01
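
One way to relate the two figures in the abstract above (an aggregate rate beyond 1 TB/s and a 4.48 Gb/s link) is to estimate how many links such a rate would require. The sketch below does only that arithmetic; the aggregate rate and the derating factor are illustrative assumptions, while the 4.48 Gb/s figure comes from the abstract.

```python
# Minimal sketch: number of read-out links of a given speed needed for an
# aggregate detector data rate. Aggregate rate and derating factor are assumptions.

import math

def links_needed(aggregate_tb_s: float, link_gb_s: float, usable_fraction: float = 0.8) -> int:
    """Number of links required, allowing for protocol/derating overhead."""
    aggregate_gbit_s = aggregate_tb_s * 8000          # TB/s -> Gbit/s
    return math.ceil(aggregate_gbit_s / (link_gb_s * usable_fraction))

print(links_needed(1.0, 4.48))   # e.g. ~1 TB/s carried over 4.48 Gb/s GBT links
```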

ALICE (A Large Ion Collider Experiment) is the heavy-ion detector designed to study the physics of strongly interacting matter and the quark-gluon plasma at the CERN Large Hadron Collider (LHC). The online Data Quality Monitoring (DQM) is a key element of the Data Acquisition's software chain. It provides shifters with precise and complete information to quickly identify and overcome problems, and as a consequence to ensure the acquisition of high quality data. DQM typically involves the gathering of data, its analysis by user-defined algorithms and the visualization of the monitored data.

10.1088/1742-6596/331/2/022030 article EN Journal of Physics Conference Series 2011-12-23
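
The gather / analyse-with-user-defined-algorithms / publish pattern described above can be illustrated with a small monitoring loop. The sketch below is a generic illustration of that pattern; the class and function names are hypothetical and do not reflect the ALICE DQM API.

```python
# Minimal sketch of a data-quality-monitoring loop: gather monitored quantities,
# run user-defined checks, report the result for shifters.
# Names are hypothetical, not the ALICE DQM interface.

from dataclasses import dataclass
from typing import Callable, Iterable

@dataclass
class CheckResult:
    name: str
    ok: bool
    message: str

def run_checks(samples: Iterable[float],
               checks: list[tuple[str, Callable[[list[float]], bool]]]) -> list[CheckResult]:
    data = list(samples)
    results = []
    for name, predicate in checks:
        ok = predicate(data)
        results.append(CheckResult(name, ok, "OK" if ok else "needs shifter attention"))
    return results

if __name__ == "__main__":
    # Hypothetical monitored quantity: per-event payload sizes in kB.
    payload_kb = [48.2, 51.0, 47.5, 49.9, 250.0]
    checks = [
        ("mean payload below 100 kB", lambda d: sum(d) / len(d) < 100),
        ("no empty events", lambda d: all(x > 0 for x in d)),
    ]
    for r in run_checks(payload_kb, checks):
        print(f"[{'PASS' if r.ok else 'FAIL'}] {r.name}: {r.message}")
```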

ALICE (A Large Ion Collider Experiment) is the heavy-ion detector designed to study the physics of strongly interacting matter and the Quark-Gluon Plasma at the CERN Large Hadron Collider (LHC). A large bandwidth and flexible Data-Acquisition System (DAQ) has been deployed to collect sufficient statistics in the short running time available per year for heavy ions and to accommodate the very different requirements originating from the 18 sub-detectors. After several months of data taking with beam, a lot of experience has been accumulated and some important...

10.1088/1742-6596/331/2/022028 article EN Journal of Physics Conference Series 2011-12-23

The PCI-based Readout Receiver Card (PRORC) is the primary interface between the detector data link (an optical device called DDL) and the front-end computers (PCs running Linux) of the ALICE data acquisition system. This document describes the prototype architecture of the PRORC hardware and firmware, and of the PC software. The board contains a PCI circuit and an FPGA. The firmware in the FPGA is responsible for all concurrent activities of the board, such as reading the DDL and controlling the DMA. Its co-operation with the software allows autonomous transfer into memory with...

10.5170/cern-2002-003.281 article EN 2001-10-18
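
"Autonomous transfer into memory" typically relies on a handshake in which the host posts free buffer descriptors and the firmware fills them. The sketch below is a purely illustrative simulation of such a descriptor-ring handshake; it is not the actual PRORC firmware or driver interface.

```python
# Minimal sketch of a descriptor-ring handshake between firmware and host software.
# Illustrative simulation only, not the PRORC design.

from collections import deque

class DescriptorRing:
    """Host posts free page descriptors; 'firmware' fills them and hands them back."""
    def __init__(self, n_pages: int):
        self.free = deque(range(n_pages))   # pages the firmware may write into
        self.ready = deque()                # pages filled with event fragments

    def firmware_writes_fragment(self, payload: bytes) -> bool:
        if not self.free:
            return False                    # back-pressure: no free page available
        page = self.free.popleft()
        self.ready.append((page, payload))
        return True

    def host_reads_fragment(self):
        if not self.ready:
            return None
        page, payload = self.ready.popleft()
        self.free.append(page)              # recycle the page for further transfers
        return payload

if __name__ == "__main__":
    ring = DescriptorRing(n_pages=4)
    ring.firmware_writes_fragment(b"fragment-0")
    ring.firmware_writes_fragment(b"fragment-1")
    print(ring.host_reads_fragment())       # b'fragment-0'
```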

The research project RD24 is studying applications of the Scalable Coherent Interface (IEEE-1596) standard for the Large Hadron Collider (LHC). First SCI node chips from Dolphin were used to demonstrate the use and functioning of SCI's packet protocols and to measure data rates. We present results from a first, two-node ringlet at CERN, based on an R3000 RISC processor and a DMA interface to the MC68040 bus. A diagnostic link analyzer monitors traffic up to full bandwidth. In its second phase, RD24 will build a first implementation of a multi-ringlet merger...

10.1109/23.281471 article EN IEEE Transactions on Nuclear Science 1994-02-01

ALICE (A Large Ion Collider Experiment) [1] is the heavy-ion experiment being prepared for the Large Hadron Collider (LHC) at CERN. Running the experiment implies performing a set of activities on several particle detectors. In ALICE these activities are grouped into four domains: Detector Control System (DCS), Data Acquisition (DAQ), Trigger (TRG) and High Level Trigger (HLT). The Experiment Control System (ECS) is the top control level of the experiment. In October 2004 the ECS has been used to control combined beam tests of the Inner Tracking System (ITS) with an experimental setup made of three...

10.5170/cern-2005-002.155 article EN 2005-01-01

ALICE (A Large Ion Collider Experiment) is the heavy-ion detector studying the physics of strongly interacting matter and the quark-gluon plasma at the CERN LHC (Large Hadron Collider). The DAQ (Data Acquisition System) facilities handle the data flow from the detectors electronics up to mass storage. The system is based on a large farm of commodity hardware consisting of more than 600 devices (Linux PCs, storage, network switches), and controls hundreds of distributed software components working together. This paper presents Orthos,...

10.1088/1742-6596/396/1/012013 article EN Journal of Physics Conference Series 2012-12-13

ALICE [1] (A Large Ion Collider Experiment) is the detector system at the LHC (Large Hadron Collider) optimized for the study of heavy-ion collisions. Its main aim is to study the behavior of strongly interacting matter and the quark-gluon plasma. Currently all the information sent by the 18 sub-detectors composing the experiment is read out by DATE [2] (Data Acquisition Test Environment), the data acquisition software, using several optical links called DDL [3] (Detector Data Link), each one with a maximum throughput of 200 MB/s. In the last year...

10.1016/j.phpro.2012.02.518 article EN Physics Procedia 2012-01-01

The multiplicity of charged particles in the central rapidity region has been measured by the NA57 experiment in Pb–Pb collisions at the CERN SPS at two beam momenta: 158 A GeV/c and 40 A GeV/c. The value of dN_ch/dη at its maximum has been determined and its behaviour as a function of centrality studied in the range covered (about 50% of the inelastic cross section). The multiplicity increases approximately logarithmically with the centre of mass energy.

10.1088/1742-6596/5/1/006 article EN Journal of Physics Conference Series 2005-01-01
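
The approximately logarithmic energy dependence mentioned above is commonly expressed as dN_ch/dη = a + b·ln(√s_NN) and fitted to measurements at different energies. The sketch below shows such a fit in general terms; the data points are illustrative placeholders, not the NA57 values.

```python
# Minimal sketch: fit dN_ch/deta = a + b*ln(sqrt(s_NN)), the approximately
# logarithmic energy dependence described above. Data points are placeholders.

import math

def fit_log(points):
    """Least-squares fit of y = a + b*ln(x)."""
    xs = [math.log(x) for x, _ in points]
    ys = [y for _, y in points]
    n = len(points)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    a = mean_y - b * mean_x
    return a, b

# (sqrt(s_NN) in GeV, dN_ch/deta at maximum) -- hypothetical values.
points = [(8.8, 220.0), (17.3, 310.0)]
a, b = fit_log(points)
print(f"dN_ch/deta ~ {a:.1f} + {b:.1f} * ln(sqrt(s_NN))")
```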

ALICE (A Large Ion Collider Experiment) is the heavy-ion experiment designed to study the physics of strongly interacting matter and the quark-gluon plasma at the CERN Large Hadron Collider (LHC). Since its successful start-up in 2010, the LHC has been performing outstandingly, providing the experiments with long periods of stable collisions and an integrated luminosity that greatly exceeds the planned targets. To fully explore these privileged conditions, it is paramount to keep the experiment's data taking efficiency as high as possible. In ALICE,...

10.1109/rtc.2012.6418393 article EN 2012-06-01