F. Meijers
- Particle physics theoretical and experimental studies
- High-Energy Particle Collisions Research
- Quantum Chromodynamics and Particle Interactions
- Particle Detector Development and Performance
- Computational Physics and Python Applications
- Advanced Data Storage Technologies
- Dark Matter and Cosmic Phenomena
- Distributed and Parallel Computing Systems
- Cosmology and Gravitation Theories
- Neutrino Physics Research
- Parallel Computing and Optimization Techniques
- Medical Imaging Techniques and Applications
- Interconnection Networks and Systems
- Scientific Computing and Data Management
- Particle Accelerators and Free-Electron Lasers
- Astrophysics and Cosmic Phenomena
- Nuclear physics research studies
- Black Holes and Theoretical Physics
- Radiation Detection and Scintillator Technologies
- Gamma-ray bursts and supernovae
- Dutch Social and Cultural Studies
- Distributed systems and fault tolerance
- Educational Practices and Policies
- Superconducting Materials and Applications
- International Science and Diplomacy
European Organization for Nuclear Research
2016-2025
University of Antwerp
1987-2024
Institute of High Energy Physics
2013-2024
A. Alikhanyan National Laboratory
2022-2024
Cukurova University
2023
Fermi Research Alliance
2019
Massachusetts Institute of Technology
2019
National Technical University of Athens
2019
University of California, San Diego
2019
Vilnius University
2019
For the Phase-2 upgrade of the CMS experiment, the central DAQ group designed and developed two custom ATCA boards. These boards provide the interfaces between the sub-detector electronics systems and the central DAQ. This paper describes our experience with the chosen prototyping strategy, with a focus on the design modification choices made along the way. It concludes with a brief overview of recent firmware developments and a look at the transition towards full board production.
In The Netherlands, a growing number of vocational education and training institutes are implementing competence-based approaches to learning, including new career guidance practices. These practices often involve instruments such as portfolios or personal development plans, aimed at supporting students in their search for a sense of direction, in making occupational choices and in developing their identities. In this study, the perceptions of teachers and counsellors on these plans were investigated in two schools and one prevocational school....
The CMS data acquisition system is made of two major subsystems: event building and event filter. The presented paper describes the architecture and design of the software that processes the data flow in the currently operating experiment. The central DAQ relies on industry-standard networks and processing equipment. Adopting a single infrastructure for all subsystems of the experiment imposes, however, a number of different requirements. High efficiency and configuration flexibility are among the most important ones. XDAQ has matured over eight years...
The Compact Muon Solenoid (CMS) experiment operating at the CERN (European Organization for Nuclear Research) Large Hadron Collider (LHC) is in the process of upgrading several of its detector systems. Adding more individual components brings the need to test and commission them separately from the existing ones, so as not to compromise physics data-taking. The CMS Trigger, Timing and Control (TTC) system had reached its limits in terms of the number of separate elements (partitions) that could be supported. A new Distribution System...
The data-acquisition system of the CMS experiment at the LHC performs the read-out and assembly of events accepted by the first-level hardware trigger. Assembled events are made available to the high-level trigger, which selects interesting events for offline storage and analysis. The system is designed to handle a maximum input rate of 100 kHz and an aggregated throughput of 100 GB/s originating from approximately 500 sources. An overview of the architecture and design of the DAQ software is given. We discuss performance and operational experience after months of physics data taking.
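Taking the headline design figures quoted above at face value, the implied averages per event and per source can be worked out directly. The short sketch below is purely illustrative and assumes uniform fragment sizes across sources.

```python
# Rough back-of-envelope figures implied by the quoted design parameters
# (illustrative only; actual fragment sizes vary per sub-detector).

TRIGGER_RATE_HZ = 100e3        # first-level trigger accept rate: 100 kHz
THROUGHPUT_BPS = 100e9         # aggregate event-builder throughput: 100 GB/s
N_SOURCES = 500                # approximate number of read-out sources

avg_event_size = THROUGHPUT_BPS / TRIGGER_RATE_HZ      # ~1 MB per event
avg_fragment_size = avg_event_size / N_SOURCES         # ~2 kB per source

print(f"average event size:    {avg_event_size / 1e6:.1f} MB")
print(f"average fragment size: {avg_fragment_size / 1e3:.1f} kB")
```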
For the upgrade of the DAQ of the CMS experiment in 2013/2014, an interface between the custom detector Front End Drivers (FEDs) and the new event-builder network has to be designed. For a loss-less data collection from more than 600 FEDs, an FPGA-based card implementing the TCP/IP protocol suite over 10 Gbps Ethernet has been developed. We present the hardware, the challenges and the modifications made to TCP in order to simplify its implementation, together with a set of performance measurements which were carried out on the current prototype.
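The firmware itself runs in an FPGA, but the core simplification mentioned here, a unidirectional sender with a fixed window and cumulative acknowledgements and no receive-side data path, can be illustrated conceptually. The sketch below is a minimal software model under those assumptions; the class and field names are hypothetical and not taken from the paper.

```python
# Minimal sketch of a unidirectional, simplified TCP-like sender window.
# Hypothetical names; the real implementation is FPGA firmware, and only
# the sender side of the protocol (data out, ACKs in) is modelled here.

class SimplifiedTcpSender:
    def __init__(self, window_size: int):
        self.snd_una = 0            # oldest unacknowledged sequence number
        self.snd_nxt = 0            # next sequence number to send
        self.window = window_size   # fixed send window (no congestion control)
        self.retransmit_queue = {}  # seq -> segment awaiting acknowledgement

    def can_send(self, length: int) -> bool:
        # Only send while the unacknowledged data fits in the window.
        return self.snd_nxt + length - self.snd_una <= self.window

    def send(self, segment: bytes) -> int:
        assert self.can_send(len(segment))
        seq = self.snd_nxt
        self.retransmit_queue[seq] = segment
        self.snd_nxt += len(segment)
        return seq                  # hand segment and seq to the link layer here

    def on_ack(self, ack: int) -> None:
        # Cumulative ACK: drop everything acknowledged so far.
        acked = [s for s, seg in self.retransmit_queue.items() if s + len(seg) <= ack]
        for seq in acked:
            del self.retransmit_queue[seq]
        self.snd_una = max(self.snd_una, ack)
```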
The data acquisition (DAQ) system of the CMS experiment at the CERN Large Hadron Collider assembles events at a rate of 100 kHz, transporting event data at an aggregate throughput of 100 GB/s to the high-level trigger (HLT) farm. The HLT farm selects interesting events for storage and offline analysis at a rate of around 1 kHz. The DAQ system has been redesigned during the accelerator shutdown in 2013/14. The motivation is twofold: firstly, the current compute nodes, networking, and infrastructure will have reached the end of their lifetime by the time the LHC restarts. Secondly,...
The DAQ system of the CMS experiment at CERN collects data from more than 600 custom detector Front-End Drivers (FEDs). During 2013 and 2014 it will undergo a major upgrade to address the obsolescence of the current hardware and the requirements posed by the LHC accelerator and various detector components. For a loss-less collection of data from the FEDs, a new FPGA-based card implementing the TCP/IP protocol suite over 10 Gbps Ethernet has been developed. To limit the TCP implementation complexity, the group developed a simplified, unidirectional but RFC 793...
The CMS data acquisition system is designed to build and filter events originating from 476 detector sources at a maximum trigger rate of 100 kHz. Different architectures and switch technologies have been evaluated to accomplish this purpose. Events will be built in two stages: the first stage will be a set of event builders called front-end driver (FED) builders. These will be based on Myrinet technology and will pre-assemble groups of about eight sources. The second stage, the readout builders, will perform the building of full events. A single readout builder will assemble 60 fragments of 16 kB...
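A minimal sketch of the two-stage scheme described above, in which FED builders first pre-assemble super-fragments from small groups of sources and readout builders then combine them into full events, is given below. The function names and the fixed group size of eight are illustrative assumptions.

```python
# Sketch of two-stage event building: FED builders pre-assemble
# super-fragments from groups of ~8 sources, readout builders then
# combine the super-fragments into full events. Names are hypothetical.

from collections import defaultdict

GROUP_SIZE = 8  # sources pre-assembled per FED builder

def fed_builder_stage(fragments: dict) -> dict:
    """First stage: concatenate the fragments of each group of sources."""
    super_fragments = defaultdict(bytes)
    for source_id, payload in sorted(fragments.items()):
        super_fragments[source_id // GROUP_SIZE] += payload
    return dict(super_fragments)

def readout_builder_stage(super_fragments: dict) -> bytes:
    """Second stage: assemble the full event from all super-fragments."""
    return b"".join(payload for _, payload in sorted(super_fragments.items()))

# Toy event with 476 sources of 16 kB each (the figures quoted in the abstract).
fragments = {src: bytes(16 * 1024) for src in range(476)}
event = readout_builder_stage(fed_builder_stage(fragments))
assert len(event) == 476 * 16 * 1024
```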
Summary form only given. The data acquisition system (DAQ) of the CMS experiment at the CERN Large Hadron Collider assembles events at a rate of 100 kHz, transporting event data at an aggregate throughput of 100 GB/s to the high-level trigger (HLT) farm. The HLT farm selects interesting events for storage and offline analysis at a rate of around 1 kHz. The DAQ system has been redesigned during the accelerator shutdown in 2013/14. The motivation is twofold: firstly, the current compute nodes, networking, and infrastructure will have reached the end of their lifetime by the time the LHC...
The CMS data acquisition (DAQ) is implemented as a service-oriented architecture where DAQ applications, as well as general applications such as monitoring and error reporting, are run as self-contained services. The task of deployment and operation of the services is achieved by using several heterogeneous facilities and custom configuration scripts in several languages. In this work, we restructure the existing system into a homogeneous, scalable cloud adopting a uniform paradigm, all orchestrated in an environment with standardized...
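The abstract does not name a specific orchestration technology; purely as an illustration of what a uniform paradigm for heterogeneous DAQ applications could look like, the sketch below declares different applications through one common service description. All names and fields are hypothetical.

```python
# Illustrative sketch of describing heterogeneous DAQ applications with one
# uniform service definition, so they can all be deployed and operated the
# same way. All names and fields are hypothetical.

from dataclasses import dataclass, field

@dataclass
class ServiceSpec:
    name: str                      # logical service name
    image: str                     # self-contained runtime image
    replicas: int = 1              # how many instances to run
    env: dict = field(default_factory=dict)  # per-service configuration

# Event building, monitoring and error reporting declared uniformly:
services = [
    ServiceSpec("event-builder", "daq/evb:latest", replicas=8),
    ServiceSpec("monitoring",    "daq/monitor:latest"),
    ServiceSpec("error-report",  "daq/errors:latest", env={"LOG_LEVEL": "warn"}),
]

for spec in services:
    # A single orchestration loop treats every application the same way.
    print(f"deploy {spec.replicas}x {spec.name} from {spec.image}")
```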
The Compact Muon Solenoid (CMS) experiment at CERN incorporates one of the highest-throughput data acquisition systems in the world and is expected to increase its throughput by more than a factor of ten for the High-Luminosity phase of the Large Hadron Collider (HL-LHC). To achieve this goal, the system will be upgraded in most of its components. Among them, the event builder software, in charge of assembling all the data read out from the different sub-detectors, is planned to be modified from building single events to building an orbit that assembles multiple events at the same time. The increase over the current...
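A minimal sketch of the orbit-based idea, grouping all events that belong to one LHC orbit and assembling them together rather than one event at a time, is shown below; the data layout and names are illustrative assumptions, not the actual event-builder interfaces.

```python
# Minimal sketch of orbit-based building: instead of assembling one event
# at a time, all events belonging to the same LHC orbit are collected from
# every source and assembled together. Names and structures are hypothetical.

from collections import defaultdict

def build_orbits(fragments):
    """fragments: iterable of (orbit_id, event_id, source_id, payload)."""
    orbits = defaultdict(lambda: defaultdict(dict))
    for orbit_id, event_id, source_id, payload in fragments:
        orbits[orbit_id][event_id][source_id] = payload
    # One assembled unit per orbit, each containing several complete events.
    return {
        orbit_id: {
            event_id: b"".join(parts[s] for s in sorted(parts))
            for event_id, parts in events.items()
        }
        for orbit_id, events in orbits.items()
    }

# Toy input: 2 orbits, 3 events per orbit, 4 sources per event.
toy = [(o, e, s, bytes(8)) for o in range(2) for e in range(3) for s in range(4)]
assembled = build_orbits(toy)
assert len(assembled) == 2 and all(len(ev) == 3 for ev in assembled.values())
```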