- Theoretical and Experimental Particle Physics Studies
- High-Energy Particle Collisions Research
- Quantum Chromodynamics and Particle Interactions
- Particle Detector Development and Performance
- Dark Matter and Cosmic Phenomena
- Computational Physics and Python Applications
- Cosmology and Gravitation Theories
- Neutrino Physics Research
- Distributed and Parallel Computing Systems
- Black Holes and Theoretical Physics
- Particle Accelerators and Free-Electron Lasers
- Scientific Computing and Data Management
- Big Data Technologies and Applications
- Parallel Computing and Optimization Techniques
- Noncommutative and Quantum Gravity Theories
- Medical Imaging Techniques and Applications
- Gamma-Ray Bursts and Supernovae
- Atomic and Subatomic Physics Research
- Stochastic Processes and Financial Applications
- Astrophysics and Cosmic Phenomena
- Green IT and Sustainability
- Nuclear Reactor Physics and Engineering
- International Science and Diplomacy
- Optical Properties and Cooling Technologies in Crystalline Materials
- Advanced Data Storage Technologies
FH Aachen
2024-2025
RWTH Aachen University
2020-2025
Princeton University
2025
Institute of High Energy Physics
2022-2024
University of Antwerp
2024
A. Alikhanyan National Laboratory
2022-2024
Deutsches Elektronen-Synchrotron DESY
2021
Abstract We introduce a Python package that provides simple and unified access to a collection of datasets from fundamental physics research (including particle physics, astroparticle physics, and hadron and nuclear physics) for supervised machine learning studies. The datasets contain hadronic top quarks, cosmic-ray-induced air showers, phase transitions in matter, and generator-level histories. While public datasets from multiple disciplines already exist, the common interface and provided reference models simplify future work on...
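The abstract above describes a common interface to heterogeneous physics datasets. The package's actual API is not shown here, so the following is a minimal illustrative sketch of what such a unified loader could look like; the names `Dataset` and `load` are hypothetical, and the toy data stands in for real downloads.

```python
# Hypothetical sketch of a unified dataset interface as described in the
# abstract; `Dataset` and `load` are illustrative names, not the real API.
import numpy as np


class Dataset:
    """Bundle features and labels under one common interface."""

    def __init__(self, name, features, labels):
        self.name = name
        self.features = features  # shape: (n_events, n_features)
        self.labels = labels      # shape: (n_events,)


def load(name, n_events=100, seed=0):
    """Return a toy dataset; a real package would download and cache files."""
    rng = np.random.default_rng(seed)
    features = rng.normal(size=(n_events, 4))
    labels = rng.integers(0, 2, size=n_events)
    return Dataset(name, features, labels)


ds = load("top_tagging")
print(ds.features.shape, ds.labels.shape)  # (100, 4) (100,)
```

The point of such a design is that every dataset, whether top-quark jets or air showers, is exposed through the same attributes, so downstream training code does not change between disciplines.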
Abstract The development of an LHC physics analysis involves numerous investigations that require the repeated processing of terabytes of data. Thus, a rapid completion of each of these cycles is central to mastering the science project. We present a solution to efficiently handle and accelerate analyses on small-size institute clusters. Our solution uses three key concepts: vectorized processing of collision events, the "MapReduce" paradigm for scaling out on computing clusters, and SSD caching to reduce latencies in IO operations. This...
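Two of the three concepts named in this abstract, vectorized event processing and MapReduce-style scale-out, can be sketched together. The snippet below is a toy illustration, not the authors' implementation: a per-event quantity is computed with one array operation instead of a Python loop, and chunks of events are combined via map and reduce steps (here serial, standing in for cluster workers).

```python
# Minimal sketch of vectorized event processing plus a MapReduce-style
# combination of per-chunk partial results (toy data, not the paper's code).
import numpy as np
from functools import reduce


def transverse_momentum(px, py):
    # One array operation covers all events at once (no Python loop).
    return np.hypot(px, py)


def map_chunk(chunk):
    px, py = chunk
    pt = transverse_momentum(px, py)
    # Emit a partial result per chunk: event count and pT sum for a global mean.
    return pt.size, pt.sum()


def reduce_partials(a, b):
    return a[0] + b[0], a[1] + b[1]


rng = np.random.default_rng(42)
chunks = [(rng.normal(size=1000), rng.normal(size=1000)) for _ in range(4)]
n, total = reduce(reduce_partials, map(map_chunk, chunks))
print(f"mean pT over {n} events: {total / n:.3f}")
```

On a real cluster the `map_chunk` calls would run on different nodes; only the small partial results travel back for the reduce step, which is what makes the pattern scale.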
The second PyHEP.dev workshop, part of the "Python in HEP Developers" series organized by the HEP Software Foundation (HSF), took place in Aachen, Germany, from August 26 to 30, 2024. This gathering brought together nearly 30 Python package developers, maintainers, and power users to engage in informal discussions about current trends in Python, with a primary focus on analysis tools and techniques for High Energy Physics (HEP). The workshop agenda encompassed a range of topics, such as defining the scope of data analysis, exploring...
Research on Universe and Matter (ErUM), conducted at major infrastructures such as CERN and at large observatories in collaboration with university groups, plays an important role in driving the digital transformation of the future. The German action plan "ErUM-Data" [2] promotes this transformation through the interdisciplinary networking and financial support of about 20,000 scientists. The ErUM-Data-Hub [4] serves as a central transfer office to meet these ambitions. One of its tasks is designing, organizing, and performing schools and workshops...
Abstract Many particle physics analyses are adopting the concept of vectorised computing, often making them increasingly performant and resource-efficient. While a variety of computing steps can be vectorised directly, some calculations are challenging to implement. One of these is the analytical neutrino reconstruction, which involves a fitting procedure that naturally varies between events. We show an implementation using a graph computation model. It uses established deep learning software libraries and is natively portable to local and external hardware...
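The difficulty named in this abstract is that each event has its own fit, which resists naive vectorisation. The paper's analytical reconstruction is not reproduced here; as a stand-in, the toy below batches a per-event minimization (each event carries its own target parameter) into a single vectorized gradient-descent loop, using plain NumPy where the paper uses a deep-learning graph.

```python
# Hedged toy example: per-event fits with event-dependent targets, run as one
# batched computation, illustrating the idea behind a graph-model fit.
import numpy as np


def batched_fit(targets, steps=200, lr=0.1):
    """Minimize (x_i - t_i)^2 for all events i simultaneously."""
    x = np.zeros_like(targets)
    for _ in range(steps):
        grad = 2.0 * (x - targets)  # analytic gradient, one array op per step
        x -= lr * grad
    return x


targets = np.array([0.5, -1.2, 3.0, 7.5])  # event-dependent fit parameters
solution = batched_fit(targets)
print(np.allclose(solution, targets, atol=1e-3))  # True
```

Expressing the update as array operations is what lets a deep-learning library run the same computation unchanged on a CPU, a GPU, or external hardware, which matches the portability claim in the abstract.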
Given the urgency to reduce fossil fuel energy production and make climate tipping points less likely, we call for resource-aware knowledge gain in the research areas on Universe and Matter, with an emphasis on the digital transformation. A portfolio of measures is described in detail and then summarized according to the timescales required for their implementation. The measures will both contribute to sustainable research and accelerate scientific progress through increased awareness of resource usage. This work is based on a three-day workshop on sustainability...
The VISPA (VISual Physics Analysis) project provides a streamlined work environment for physics analyses and hands-on teaching experiences with a focus on deep learning. It has already been successfully used in HEP analyses and is now being further developed into an interactive learning platform. One specific example is meeting knowledge-sharing needs by combining paper, code, and data in one central place. Additionally, the possibility to run it directly from a web browser is a key feature of this development. Any SSH-reachable...