- Theoretical and Experimental Particle Physics Studies
- High-Energy Particle Collisions Research
- Quantum Chromodynamics and Particle Interactions
- Particle Detector Development and Performance
- Dark Matter and Cosmic Phenomena
- Computational Physics and Python Applications
- Cosmology and Gravitation Theories
- Neutrino Physics Research
- Distributed and Parallel Computing Systems
- Scientific Computing and Data Management
- Astrophysics and Cosmic Phenomena
- Advanced Data Storage Technologies
- Parallel Computing and Optimization Techniques
- Black Holes and Theoretical Physics
- Atomic and Subatomic Physics Research
- Big Data Technologies and Applications
- Cloud Computing and Resource Management
- Peer-to-Peer Network Technologies
- Particle Accelerators and Free-Electron Lasers
- Optical Properties and Cooling Technologies in Crystalline Materials
- Medical Imaging Techniques and Applications
- Data Visualization and Analytics
- Stochastic Processes and Financial Applications
- Noncommutative and Quantum Gravity Theories
- Nuclear Physics Research Studies
Karlsruhe Institute of Technology
2019-2025
Institute of High Energy Physics
2021-2023
RWTH Aachen University
2016-2019
Czech Academy of Sciences, Institute of Physics
2019
Scientific collaborations require a strong computing infrastructure to successfully process and analyze data. While large-scale collaborations have access to resources such as Analysis Facilities, small-scale collaborations often lack the means to establish and maintain such an infrastructure and instead operate with fragmented analysis environments, resulting in inefficiencies, hindering reproducibility, and thus creating additional challenges for the collaboration that are not related to the experiment itself. We present a scalable, lightweight, and maintainable Analysis Facility developed...
Increasing computing demands and concerns about energy efficiency in high-performance and high-throughput computing are driving forces in the search for more efficient ways to use available resources. Sharing the resources of an underutilised cluster with a cluster under high workload increases the overall efficiency. The software COBalD/TARDIS can dynamically and transparently integrate and disintegrate such resources. However, sharing resources also requires accounting. AUDITOR (AccoUnting Data HandlIng Toolbox for Opportunistic Resources) is a modular...
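To illustrate the accounting idea, here is a minimal Python sketch of a record for resource consumption on shared resources. All field names are illustrative assumptions, not AUDITOR's actual schema or API:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

# Hypothetical accounting record for jobs run on shared opportunistic
# resources; the fields are illustrative, not AUDITOR's actual schema.
@dataclass
class AccountingRecord:
    record_id: str                                   # unique identifier of the accounted job
    site: str                                        # site that provided the resource
    user: str                                        # user or group the consumption is billed to
    components: dict = field(default_factory=dict)   # e.g. {"cores": 8, "memory_mb": 16000}
    start: Optional[datetime] = None
    stop: Optional[datetime] = None

    def runtime_seconds(self) -> float:
        """Wall-clock duration covered by this record."""
        return (self.stop - self.start).total_seconds()

record = AccountingRecord(
    record_id="job-0042",
    site="shared-hpc",
    user="group-a",
    components={"cores": 8, "memory_mb": 16000},
    start=datetime(2024, 5, 1, 12, 0, tzinfo=timezone.utc),
    stop=datetime(2024, 5, 1, 14, 30, tzinfo=timezone.utc),
)
print(record.runtime_seconds())  # 9000.0
```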
Increased operational effectiveness and the dynamic integration of only temporarily available compute resources (opportunistic resources) become more important in the next decade, due to the scarcity of dedicated resources for future high energy physics experiments as well as the desired integration of cloud and high performance computing resources. This results in a heterogeneous environment, which gives rise to huge challenges for the operation teams of the experiments. At the Karlsruhe Institute of Technology (KIT) we design solutions to tackle these challenges. In order to ensure...
High throughput and short turnaround cycles are core requirements for the efficient processing of data-intense end-user analyses in High Energy Physics (HEP). Together with the tremendously increasing amount of data to be processed, this leads to enormous challenges for HEP storage systems, networks, and the distribution of computing resources for analyses. Bringing data close to the computing resource is a very promising approach to solve these limitations and improve the overall performance. However, achieving data locality by placing multiple conventional caches...
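One simple way to coordinate placement across multiple caches, sketched here under the assumption that each file should live on exactly one cache node: hashing the file name gives every job and every cache the same answer without a central lookup table. Node names are illustrative:

```python
import hashlib

# Illustrative cache nodes; a real deployment would discover these.
CACHE_NODES = ["cache-01", "cache-02", "cache-03"]

def cache_node_for(filename: str) -> str:
    """Map a file deterministically to one cache node via its name hash."""
    digest = hashlib.sha1(filename.encode()).hexdigest()
    return CACHE_NODES[int(digest, 16) % len(CACHE_NODES)]

for f in ["data/run1/events_001.root", "data/run1/events_002.root"]:
    print(f, "->", cache_node_for(f))
```

Because the mapping is a pure function of the file name, a scheduler can steer a job to the node that caches its input without querying any service.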
Demand for computing resources in high energy physics (HEP) shows a highly dynamic behavior, while the resources provided by the Worldwide LHC Computing Grid (WLCG) remain static. It has become evident that opportunistic resources such as High Performance Computing (HPC) centers and commercial clouds are well suited to cover peak loads. However, the utilization of these resources gives rise to new levels of complexity, e.g. resources need to be managed dynamically and HEP applications require a very specific software environment usually not provided at these resources...
VISPA (Visual Physics Analysis) is a web platform that enables users to work on any secure shell (SSH) reachable resource using just their web browser. It is used successfully in research and education for HEP data analysis. The emerging JupyterLab is an ideal choice for a comprehensive, browser-based, and extensible work environment, and we seek to unify it with the efforts of the VISPA project. The primary objective is to provide the user with the freedom to access any external resources at their disposal, while maintaining a smooth integration of preconfigured...
The Visual Physics Analysis (VISPA) project defines a toolbox for accessing software via the web. It is based on the latest web technologies and provides a powerful extension mechanism that enables it to interface with a wide range of applications. Beyond basic applications such as a code editor, a file browser, or a terminal, it meets the demands of sophisticated experiment-specific use cases that focus on physics data analyses and typically require a high degree of interactivity. As an example, we developed an inspector capable of browsing...
Current and future end-user analysis workflows in High Energy Physics demand the processing of growing amounts of data. This plays a major role when looking at the demands in the context of the High-Luminosity LHC. In order to keep the turn-around cycles as short as possible, analysis clusters optimized with respect to these demands can be used. Since hyper-converged servers offer a good combination of compute power and local storage, they form an ideal basis for such clusters. In this contribution we report on the setup and commissioning of a dedicated...
The German CMS community (DCMS) as a whole can benefit from the various compute resources available to its different institutes. While Grid-enabled and National Analysis Facility resources are usually shared within the community, local and recently enabled opportunistic resources like HPC centers and cloud resources are not. Furthermore, there is no unified submission infrastructure available. Via HTCondor's [1] mechanisms to connect resource pools, several remote pools can be connected transparently to the users and therefore be used more...
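A minimal sketch of what user-side submission into such an overlay pool can look like, assuming the htcondor Python bindings are available; the custom machine attribute in the requirements expression is a hypothetical example, not an attribute defined by the paper:

```python
import htcondor

# Describe an analysis job; the ProvidesRemotePool attribute used to steer
# jobs toward connected remote pools is purely illustrative.
job = htcondor.Submit({
    "executable": "run_analysis.sh",
    "arguments": "input_001.root",
    "request_cpus": "1",
    "request_memory": "2GB",
    "requirements": "TARGET.ProvidesRemotePool =?= True",  # hypothetical attribute
    "output": "job.out",
    "error": "job.err",
    "log": "job.log",
})

schedd = htcondor.Schedd()    # contact the local scheduler daemon
result = schedd.submit(job)   # queue the job; routing to pools is transparent
print("submitted cluster", result.cluster())
```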
Data-intensive end-user analyses in high energy physics require a high data throughput to reach short turnaround cycles. This leads to enormous challenges for storage and network infrastructure, especially when facing the tremendously increasing amount of data to be processed during High-Luminosity LHC runs. Including opportunistic resources with volatile storage systems into traditional HEP computing facilities makes this situation more complex. Bringing data close to the computing units is a promising approach to solve throughput limitations...
The current experiments in high energy physics (HEP) have a huge data rate. To process the measured data, an enormous number of computing resources is needed, and the demand will further increase with upgraded and newer experiments. To fulfill the ever-growing demand, the allocation of additional, potentially only temporarily available, non-HEP-dedicated resources is important. These so-called opportunistic resources can not only be used for analyses in general but are also well-suited to cover typical unpredictable peak demands for computing resources. For...
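The provisioning logic behind such dynamic integration (as realized, for example, by COBalD/TARDIS) can be illustrated as a simple feedback loop: grow the opportunistic pool while the acquired resources are well used, shrink it while they sit idle. This is a plain-Python sketch with illustrative thresholds, not the actual COBalD API:

```python
def adjust_demand(demand: float, utilisation: float, allocation: float,
                  step: float = 1.0) -> float:
    """Return the new resource demand for one control cycle.

    utilisation -- fraction of acquired resources doing useful work
    allocation  -- fraction of acquired resources claimed by jobs
    """
    if utilisation > 0.9:            # resources are busy: acquire more
        return demand + step
    if allocation < 0.5:             # resources are mostly idle: release some
        return max(0.0, demand - step)
    return demand                    # in between: keep the pool size stable

demand = 10.0
for util, alloc in [(0.95, 0.97), (0.95, 0.97), (0.3, 0.4)]:
    demand = adjust_demand(demand, util, alloc)
    print(demand)  # 11.0, 12.0, 11.0
```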
Computing resource needs are expected to increase drastically in the future. The HEP experiments ATLAS and CMS foresee an increase of a factor of 5-10 in the volume of recorded data in the upcoming years. The current infrastructure, namely the WLCG, is not sufficient to meet these demands in terms of computing and storage resources. The usage of non-HEP-specific resources is one way to reduce this shortage. However, using them comes at a cost: First, with multiple such resources at hand, it gets more difficult for a single user, as each normally requires its own authentication and has...
The Visual Physics Analysis (VISPA) software is a toolbox for accessing analysis software via the web. It is based on the latest web technologies and provides a powerful extension mechanism that enables it to interface with a wide range of applications. It especially meets the demands of sophisticated experiment-specific use cases that focus on physics data analyses and typically require a high degree of interactivity. As an example, we developed an inspector which is capable of browsing interactively through the event content of several data formats, e.g., MiniAOD...
Given the urgency to reduce fossil fuel energy production and make climate tipping points less likely, we call for resource-aware knowledge gain in research areas on Universe and Matter, with an emphasis on the digital transformation. A portfolio of measures is described in detail and then summarized according to the timescales required for their implementation. The measures will both contribute to sustainable research and accelerate scientific progress through increased awareness of resource usage. This work is based on a three-day workshop on sustainability...
We present the novel Analysis Workflow Management (AWM) that provides users with the tools and competences of professional large-scale workflow systems, e.g. Apache's Airavata [1]. The approach presents a paradigm shift from executing parts of an analysis to defining the analysis. Within AWM, an analysis consists of steps. For example, a step defines the execution of a certain executable for multiple files of an input data collection. Each call of one of those can be submitted to a desired location, which could be the local computer or a remote batch system. An...
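A hypothetical sketch of what "defining the analysis" as steps can look like; class and method names are illustrative assumptions, not the actual AWM interface:

```python
# Illustrative step abstraction: a step names an executable, the input
# collection it runs over, and where each call may be submitted.
class Step:
    def __init__(self, executable, inputs, location="local"):
        self.executable = executable   # program to run per input file
        self.inputs = inputs           # input data collection
        self.location = location       # "local" or a batch system name
        self.requires = []             # steps that must finish first

    def after(self, *steps):
        """Declare dependencies, so the workflow graph is documented."""
        self.requires.extend(steps)
        return self

skim = Step("skim.py", inputs=["run1.root", "run2.root"], location="batch")
plot = Step("plot.py", inputs=["skimmed.root"]).after(skim)
print([s.executable for s in plot.requires])  # ['skim.py']
```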
In particle physics, workflow management systems are primarily used as tailored solutions in dedicated areas such as Monte Carlo production. However, physicists performing data analyses are usually required to steer their individual workflows manually, which is time-consuming and often leads to undocumented relations between particular workloads. We present a generic analysis design pattern that copes with the sophisticated demands of end-to-end HEP analyses. The approach presents a paradigm shift from...
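A minimal sketch of such make-like workflow steering, written here with the open-source luigi package as an assumed underlying engine: each task declares its requirements and outputs, so the dependency graph is documented and reruns only recompute what is missing:

```python
import luigi

class Selection(luigi.Task):
    def output(self):
        return luigi.LocalTarget("selected_events.txt")

    def run(self):
        with self.output().open("w") as f:
            f.write("selected events\n")  # stand-in for the real selection

class Histogram(luigi.Task):
    def requires(self):
        return Selection()               # documented dependency

    def output(self):
        return luigi.LocalTarget("histogram.txt")

    def run(self):
        with self.input().open() as f, self.output().open("w") as out:
            out.write(f"histogram of: {f.read().strip()}\n")

if __name__ == "__main__":
    # Running Histogram triggers Selection first if its output is missing.
    luigi.build([Histogram()], local_scheduler=True)
```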