J. Ebke

ORCID: 0000-0003-1097-4733
Research Areas
  • Particle physics theoretical and experimental studies
  • High-Energy Particle Collisions Research
  • Particle Detector Development and Performance
  • Quantum Chromodynamics and Particle Interactions
  • Dark Matter and Cosmic Phenomena
  • Cosmology and Gravitation Theories
  • Computational Physics and Python Applications
  • Distributed and Parallel Computing Systems
  • Neutrino Physics Research
  • Black Holes and Theoretical Physics
  • Astrophysics and Cosmic Phenomena
  • Advanced Data Storage Technologies
  • Advanced Mathematical Theories
  • Medical Imaging Techniques and Applications
  • Scientific Computing and Data Management
  • Particle Accelerators and Free-Electron Lasers
  • Superconducting Materials and Applications
  • Muon and positron interactions and applications
  • Advanced MRI Techniques and Applications
  • Nuclear reactor physics and engineering
  • Particle accelerators and beam dynamics
  • Big Data Technologies and Applications
  • Artificial Intelligence in Healthcare
  • Radiation Detection and Scintillator Technologies
  • SARS-CoV-2 detection and testing

Ludwig-Maximilians-Universität München
2009-2016

The University of Adelaide
2013-2015

University of Belgrade
2014

Yale University
2010-2013

European Organization for Nuclear Research
2010-2012

LMU Klinikum
2012

Istanbul Technical University
2011

Max Planck Innovation
2011

Max Planck Society
2011

Lawrence Berkeley National Laboratory
2011

The Apache Hadoop software is a Java-based framework for distributed processing of large data sets across clusters of computers, using the Hadoop Distributed File System (HDFS) for storage and backup and MapReduce as the processing platform. It is primarily designed for textual data, which can be processed in arbitrary chunks, and must be adapted to the use case of binary files, which cannot be split automatically. However, it offers attractive features in terms of fault tolerance, task supervision and control, multi-user functionality and job management. For this reason, we evaluated an...

10.1088/1742-6596/513/3/032054 article EN Journal of Physics Conference Series 2014-06-11
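The MapReduce model the abstract refers to can be illustrated with a tiny word-count in Python, written in the style of a Hadoop Streaming job. This is a generic sketch, not the evaluation code from the paper; `sorted()` stands in for the shuffle/sort phase Hadoop performs between the two steps.

```python
from itertools import groupby

def mapper(lines):
    """Map step: emit one (word, 1) pair per token, as a Hadoop
    Streaming mapper would write tab-separated key/value pairs to stdout."""
    for line in lines:
        for word in line.split():
            yield word, 1

def reducer(pairs):
    """Reduce step: sum the counts for each key. Hadoop sorts the mapper
    output by key before the reduce phase; sorted() emulates that here."""
    for word, group in groupby(sorted(pairs), key=lambda kv: kv[0]):
        yield word, sum(count for _, count in group)

counts = dict(reducer(mapper(["hdfs stores blocks", "mapreduce reads blocks"])))
```

Because the framework splits text input on line boundaries, this pattern works out of the box for textual data; binary event files need a custom record reader that knows where one record ends and the next begins.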

A job submission and management tool is one of the necessary components in any distributed computing system. Such a tool should provide a user-friendly interface for physics production groups and ordinary analysis users to access heterogeneous computing resources, without requiring knowledge of the underlying grid middleware. Ganga, with its common framework and customizable plug-in structure, is such a tool. This paper will describe how experiment-specific tools for BESIII and SuperB were developed as Ganga plug-ins to meet their own...

10.1088/1742-6596/396/3/032120 article EN Journal of Physics Conference Series 2012-12-13
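The plug-in structure the abstract describes can be sketched in miniature: a common framework defines a backend interface, and experiment-specific plug-ins hide the middleware details behind it. The names below (`Backend`, `LocalBackend`, `GridBackend`, `submit`) are illustrative and are not Ganga's actual API.

```python
class Backend:
    """Common interface a job-management framework exposes to plug-ins."""
    def submit(self, job):
        raise NotImplementedError

class LocalBackend(Backend):
    """Runs the job on the local machine (sketch: just returns a handle)."""
    def submit(self, job):
        return f"local:{job['name']}"

class GridBackend(Backend):
    """Hypothetical experiment plug-in that would wrap grid middleware."""
    def submit(self, job):
        return f"grid:{job['name']}"

BACKENDS = {"local": LocalBackend, "grid": GridBackend}

def submit(job, backend="local"):
    """Users select a backend by name; middleware specifics stay in the plug-in."""
    return BACKENDS[backend]().submit(job)

handle = submit({"name": "ana1"}, backend="grid")
```

The design point is that adding a new experiment or resource type means registering one more plug-in class, with no change to user-facing code.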

Ganga is a grid job submission and management system widely used in the ATLAS and LHCb experiments and in several other communities in the context of the EGEE project. The particle physics communities have entered the LHC operation era, which brings new challenges for user data analysis: a strong growth in the number of users and jobs is already noticeable. Current work in the Ganga project is focusing on dealing with these challenges. In recent releases, support for the pilot-based systems Panda and Dirac, used by the ATLAS and LHCb experiments respectively, has been strengthened. A more scalable repository...

10.1088/1742-6596/331/7/072011 article EN Journal of Physics Conference Series 2011-12-23

Ganga is the main end-user distributed analysis tool for the ATLAS and LHCb experiments and provides the foundation layer for the HammerCloud system, used by the LHC experiments for validation and stress testing of their numerous computing facilities. Here we illustrate recent developments and demonstrate how tools that were initially developed for a specific user community have been migrated into the Ganga core, so they can be exploited by a wider user base. Similarly, examples will be given where core components have been adapted for use by specific communities in custom packages.

10.1088/1742-6596/396/3/032061 article EN Journal of Physics Conference Series 2012-12-13

In this paper, we present the High Energy Physics data format, processing toolset and analysis library a4, providing fast I/O of structured data using the Google protocol buffer library. The overall goal of a4 is to provide physicists with tools to work efficiently with billions of events, offering not only high speeds but also automatic metadata handling, a set of UNIX-like tools to operate on a4 files, and powerful histogramming capabilities. At present an experimental project, it has already been used by the authors in preparing physics...

10.1088/1742-6596/396/2/022012 article EN Journal of Physics Conference Series 2012-12-13
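a4 streams protocol buffer messages to disk. A common framing for such message streams, length-prefixed binary records, can be sketched with the standard library alone. This is a simplification for illustration: the real a4 container adds compression, metadata blocks and protobuf-specific headers.

```python
import io
import struct

def write_records(stream, payloads):
    """Write each payload as a 4-byte little-endian length prefix followed
    by the raw bytes, the usual framing for protobuf message streams."""
    for payload in payloads:
        stream.write(struct.pack("<I", len(payload)))
        stream.write(payload)

def read_records(stream):
    """Yield length-prefixed records back until end of stream."""
    while True:
        header = stream.read(4)
        if len(header) < 4:
            return
        (size,) = struct.unpack("<I", header)
        yield stream.read(size)

buf = io.BytesIO()  # stands in for an a4 file on disk
write_records(buf, [b"event-1", b"event-2"])
buf.seek(0)
```

The length prefix is what lets a reader skip or chunk records without parsing them, which is also what a UNIX-like toolset needs in order to cat, split or count events in a file.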

The focus of many software architectures in the LHC experiments is to deliver a well-designed Event Data Model (EDM). Changes and additions to stored data are often expensive, requiring large amounts of CPU time, disk storage and man-power. In addition, the differing needs of groups of physicists lead to a tendency for common data formats to grow in terms of the contained information whilst still not managing to serve all needs. We introduce a new way of thinking about the event data model based on the Dremel column-store architecture published by...

10.1088/1742-6596/513/4/042016 article EN Journal of Physics Conference Series 2014-06-11
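The column-store idea behind Dremel can be shown as a simple row-to-column transposition: instead of storing one record per event, store one array per field, so an analysis that touches a few quantities reads only those columns. This is a toy illustration; the field names (`pt`, `eta`, `nmuons`) are invented, and Dremel itself additionally handles nested, repeated fields.

```python
def to_columns(events):
    """Transpose row-oriented event dicts into a column store:
    one list per field, in event order."""
    columns = {}
    for event in events:
        for field, value in event.items():
            columns.setdefault(field, []).append(value)
    return columns

# Hypothetical events with invented field names, for illustration only.
events = [
    {"pt": 41.2, "eta": 0.3, "nmuons": 2},
    {"pt": 17.8, "eta": -1.1, "nmuons": 0},
]
cols = to_columns(events)
```

Adding a new derived quantity then means appending one new column, leaving the existing columns untouched, which is the property that makes changes and additions cheap compared with rewriting a monolithic row format.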