Thomas Jejkal

ORCID: 0000-0003-2804-688X
Research Areas
  • Distributed and Parallel Computing Systems
  • Scientific Computing and Data Management
  • Research Data Management Practices
  • Advanced Data Storage Technologies
  • Parallel Computing and Optimization Techniques
  • Medical Imaging Techniques and Applications
  • Advanced X-ray Imaging Techniques
  • Data Quality and Management
  • Image Processing and 3D Reconstruction
  • Semantic Web and Ontologies
  • Particle Detector Development and Performance
  • Advanced MRI Techniques and Applications
  • Dark Matter and Cosmic Phenomena
  • Service-Oriented Architecture and Web Services
  • Cloud Computing and Resource Management
  • Embedded Systems Design Techniques
  • Computational Physics and Python Applications
  • Advanced X-ray and CT Imaging
  • Biomedical Text Mining and Ontologies
  • Network Time Synchronization Technologies
  • Advanced Database Systems and Queries
  • Business Process Modeling and Analysis
  • Big Data Technologies and Applications
  • Opportunistic and Delay-Tolerant Networks
  • Simulation Techniques and Applications

Karlsruhe Institute of Technology
2010-2022

Forschungszentrum Jülich
2022

Deutsches Zentrum für Luft- und Raumfahrt e. V. (DLR)
2022

German Cancer Research Center
2022

Heidelberg University
2022

Helmholtz-Zentrum Berlin für Materialien und Energie
2022

GEOMAR Technologie GmbH - GTG
2022

Kerntechnische Entsorgung Karlsruhe (Germany)
2022

Technische Universität Dresden
2018

FZI Research Center for Information Technology
2006-2010

A project has been started to create a distributed testbed based on FDOs and using the DOIP protocol. This paper describes the intentions behind such a testbed and the components required to build it. It can be seen as a first step towards an international FDO ...

10.52825/ocp.v5i.1195 article EN cc-by Open Conference Proceedings 2025-03-18
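
Since the testbed components are meant to talk DOIP, a minimal sketch of a DOIP 2.0 request may help make the protocol concrete. The endpoint, port and identifiers below are placeholders, and the segment framing follows one reading of the DOIP v2.0 specification; a real client would reuse an existing DOIP library instead.

```python
import json
import socket
import ssl

# Placeholder endpoint of a hypothetical testbed component, not a real service.
DOIP_HOST, DOIP_PORT = "doip.example.org", 9000

def doip_request(operation: str, target: str) -> dict:
    """Send one DOIP 2.0 request and return the first JSON response segment.

    DOIP 2.0 exchanges JSON segments over TLS; a segment is terminated by a
    line holding a single '#', and an empty terminator segment ends the message.
    """
    request = {
        "requestId": "req-1",
        "targetId": target,        # e.g. the PID of an FDO or of a service
        "operationId": operation,  # e.g. "0.DOIP/Op.Hello", "0.DOIP/Op.Retrieve"
    }
    ctx = ssl.create_default_context()
    ctx.check_hostname = False     # testbed endpoints often use self-signed certs
    ctx.verify_mode = ssl.CERT_NONE
    with socket.create_connection((DOIP_HOST, DOIP_PORT)) as raw:
        with ctx.wrap_socket(raw, server_hostname=DOIP_HOST) as conn:
            conn.sendall(json.dumps(request).encode() + b"\n#\n#\n")
            payload = conn.recv(65536).decode()  # sketch: single read, no loop
    return json.loads(payload.split("\n#", 1)[0])

# Example: ask a component to describe itself via the built-in Hello operation.
# print(doip_request("0.DOIP/Op.Hello", "20.500.123/service"))
```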

The Large Scale Data Facility (LSDF) at the Karlsruhe Institute of Technology was started at the end of 2009 with the aim of supporting the growing requirements of data intensive experiments. In close cooperation with the involved scientific communities, the LSDF provides them not only with adequate storage space but with a directly attached analysis farm and, more importantly, value-added services for their big data-sets. Analysis workflows are supported through mixed Hadoop and OpenNebula cloud environments close to the storage and enable efficient...

10.1109/ipdps.2011.286 article EN 2011-05-01

Nowadays, the daily work of many research communities is characterized by an increasing amount and complexity of data. This makes it increasingly difficult to manage, access and utilize the data and ultimately to gain scientific insights based on it. At the same time, domain scientists want to focus on their science instead of on IT. The solution is data management, which stores data in a structured way and enables easy discovery for future reference and usage. An integral part of it is the use of metadata. With it, data becomes accessible by its content and context instead of only by name and location...

10.1016/j.future.2017.12.023 article EN cc-by-nc-nd Future Generation Computer Systems 2018-01-31
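
The core idea of the excerpt is that metadata makes data findable by content and context rather than by name and location. Below is a minimal sketch of such a record, assuming a hypothetical Dublin-Core-flavoured schema; a real repository would prescribe its own metadata model.

```python
import hashlib
import json
from datetime import date
from pathlib import Path

def describe(data_file: Path, title: str, creator: str, subjects: list[str]) -> dict:
    """Build a minimal metadata record for a data file.

    The fields are illustrative only; a real repository prescribes its own
    metadata model and stores the record alongside the data.
    """
    return {
        "title": title,
        "creator": creator,
        "created": date.today().isoformat(),
        "subjects": subjects,  # content and context: what discovery queries match on
        "identifier": data_file.name,
        "checksum": hashlib.sha256(data_file.read_bytes()).hexdigest(),
        "size": data_file.stat().st_size,
    }

# Example usage, assuming scan_0042.tif exists next to the script:
# record = describe(Path("scan_0042.tif"), "Tomography scan 42", "Doe, J.", ["CT"])
# Path("scan_0042.metadata.json").write_text(json.dumps(record, indent=2))
```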

To cope with the growing requirements of data intensive scientific experiments, models and simulations, the Large Scale Data Facility (LSDF) at KIT aims to support many scientific disciplines. The LSDF is a distributed storage facility at the Exabyte scale providing storage, archives, data bases and metadata repositories. Open interfaces and APIs support a variety of access methods and highly available services for high throughput applications. Tools for easy and transparent access allow scientists to use the LSDF without bothering with internal structures and technologies. In...

10.1109/pdp.2011.59 article EN 2011-02-01

Digital methods, tools and algorithms are gaining in importance for the analysis of digitized manuscript collections in the arts and humanities. One example is the BMBF-funded research project "eCodicology", which aims to design, evaluate and optimize algorithms for the automatic identification of macro- and micro-structural layout features of medieval manuscripts. The main goal of this project is to provide humanities scholars with better insights into high-dimensional datasets of medieval manuscripts. The heterogeneous nature and size of the data raise the need to create a database automatically...

10.1117/12.2076124 article EN Proceedings of SPIE, the International Society for Optical Engineering/Proceedings of SPIE 2015-02-08

To cope with the growing requirements of data intensive scientific experiments, models and simulations, the Large Scale Data Facility (LSDF) at KIT aims to support many scientific disciplines. The LSDF is a distributed storage facility at the Exabyte scale providing storage, archives, data bases and metadata repositories. Apart from storage, the communities need to perform data processing operations as well. For this purpose the Execution Framework for Data Intensive Applications (LAMBDA) was developed to allow asynchronous high-performance processing next to the LSDF...

10.1109/pdp.2012.69 article EN 2012-02-01
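
The excerpt does not reproduce LAMBDA's actual interfaces, so the following is only a generic sketch of the pattern it describes: submitting processing tasks asynchronously so they run next to the staged data while the client keeps working. All paths and names are hypothetical.

```python
from concurrent.futures import ProcessPoolExecutor, as_completed
from pathlib import Path

def process(dataset: Path) -> str:
    """Stand-in for a data-intensive operation executed close to the storage."""
    if not dataset.exists():
        return f"{dataset.name}: not staged"
    size = sum(f.stat().st_size for f in dataset.rglob("*") if f.is_file())
    return f"{dataset.name}: {size / 1e9:.2f} GB processed"

def submit_all(datasets: list[Path]) -> list[str]:
    # Asynchronous submission: hand over all tasks at once and collect the
    # results as they finish, instead of blocking on each dataset in turn.
    with ProcessPoolExecutor() as pool:
        futures = [pool.submit(process, d) for d in datasets]
        return [f.result() for f in as_completed(futures)]

if __name__ == "__main__":
    print(submit_all([Path("/lsdf/experiment_a"), Path("/lsdf/experiment_b")]))
```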

The Large Scale Data Facility (LSDF) was conceived and launched at the Karlsruhe Institute of Technology (KIT) at the end of 2009 to address the growing need for value-added storage services for data intensive experiments. The LSDF's main focus is to support scientific experiments producing large data sets reaching into the petabyte range with adequate storage, value-added data management, processing and preservation. In this work we describe the approach taken to perform data analysis in the LSDF, as well as the management of the datasets.

10.1109/ldav.2011.6092331 article EN IEEE Symposium on Large Data Analysis and Visualization 2011-10-01

Exponential growth in scientific research data demands novel measures for managing extremely large datasets. In particular, due to advancements in high-resolution microscopy, the nanoscopy community is producing datasets in the range of multiple TeraBytes (TB). Systematically acquired images of biological specimens compose datasets of 150-200 TB. The management of these datasets requires an optimized Generic Client Service (GCS) API with integration into a repository system. The API proposed in this paper provides an abstract...

10.1109/bigdataservice.2015.25 article EN 2015-03-01
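
The GCS API itself is not shown in the excerpt; the sketch below only illustrates the underlying pattern of streaming TB-scale image sets in bounded chunks towards a repository, with `upload` standing in for whatever transfer call the real client offers.

```python
from pathlib import Path
from typing import Callable, Iterator

CHUNK = 64 * 1024 * 1024  # 64 MiB parts keep client memory bounded for TB-scale files

def chunks(path: Path) -> Iterator[bytes]:
    """Stream one very large image file in fixed-size parts."""
    with path.open("rb") as fh:
        while block := fh.read(CHUNK):
            yield block

def ingest(dataset_dir: Path, upload: Callable[[str, int, bytes], None]) -> int:
    """Walk a microscopy dataset and push every image through `upload`."""
    count = 0
    for image in sorted(dataset_dir.rglob("*.tif")):
        for part_no, block in enumerate(chunks(image)):
            upload(image.name, part_no, block)  # placeholder for the repository call
        count += 1
    return count
```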

Nowadays, the daily work of many research communities is characterized by an increasing amount and complexity of data. This makes it increasingly difficult to manage, access and utilize the data and ultimately to gain scientific insights based on it. At the same time, domain scientists want to focus on their science instead of on IT. The solution is data management in order to store data in a structured way and enable easy discovery for future reference. An integral part of it is the use of metadata. With it, data becomes accessible by its content instead of only by name and location...

10.7287/peerj.preprints.2831v1 preprint EN 2017-02-24

The D-Grid reference installation is a test platform for the German grid initiative. Its main task is to create a prototype of the software and hardware components needed in the community. For each grid-related field, different alternative middleware solutions are included. With respect to changing demands from the community, new versions of the reference installation are released every six months.

10.1088/1742-6596/219/6/062044 article EN Journal of Physics Conference Series 2010-04-01

Grid is a rapidly growing new technology that will provide easy access to vast amounts of computer resources, both hardware and software. As these resources become available, more and more scientific users are interested in benefiting from them. At this time the main problem in accessing the grid is that users usually need to know a lot about grid methods and technologies besides their own field of application. This paper describes a toolkit based on Grid Services, designed especially for the process of data processing, providing database...

10.1109/e-science.2006.76 article EN International Conference on e-Science 2006-12-04

The presented dynamic grid service architecture provides novel and comfortable access for scientific software developers and users without prior knowledge of grid technologies or even the underlying architecture. A simple Java API allows the extension of WSRF-compliant Web services by components that are deployed automatically on available GT4 containers. In preparation, only two services are started on each container, allowing hot-deployment and performance analysis. GridIJ is a reference implementation with a problem solving...

10.1109/euromicro.2007.32 article EN 2007-08-01

In this paper, the method of data intensive computing is studied for large amounts of data in computed tomography (CT). An automatic workflow is built up to connect the beamline at ANKA with the Large Scale Data Facility (LSDF), able to enhance storage and analysis efficiency. Within this workflow, the paper focuses on parallel 3D reconstruction. Different from the existing reconstruction system using the filtered back-projection method, an algebraic reconstruction technique based on compressive sampling theory is presented to reconstruct ultrafast CT images from fewer projections. Then...

10.1109/pdp.2013.21 article EN 2013-02-01
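
The abstract names an algebraic reconstruction technique (ART); the toy below implements only the classic ART (Kaczmarz) update x <- x + lam * (b_i - a_i . x) / ||a_i||^2 * a_i, without the compressive sampling prior or the parallelization the paper adds.

```python
import numpy as np

def art(A: np.ndarray, b: np.ndarray, iterations: int = 50, lam: float = 1.0) -> np.ndarray:
    """ART/Kaczmarz: sweep over the projection equations a_i . x = b_i and
    project the current estimate onto each hyperplane in turn."""
    x = np.zeros(A.shape[1])
    row_norms = (A * A).sum(axis=1)
    for _ in range(iterations):
        for i in range(A.shape[0]):
            if row_norms[i] > 0:
                x += lam * (b[i] - A[i] @ x) / row_norms[i] * A[i]
    return x

# Toy system: 4 "pixels", only 3 ray sums (underdetermined, as with fewer
# projections); ART converges to an estimate consistent with all ray sums.
A = np.array([[1.0, 1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0, 0.0]])
x_true = np.array([1.0, 2.0, 3.0, 4.0])
print(art(A, A @ x_true))
```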

Digital methods and collaborative research in virtual environments are gaining importance for the arts and humanities. The EU-funded project DARIAH aims to enhance and support digitally-enabled research across these disciplines. The most basic but nevertheless fundamental task of it is to provide sustainable storage for data. Information contained in data like images, texts or music needs to be secured and to remain accessible even if the original information carrier becomes lost or corrupted. The heterogeneity of humanistic data raises the need for distributed,...

10.1109/pdp.2012.71 article EN 2012-02-01

Nowadays, the daily work of many research communities is characterized by an increasing amount and complexity of data. This makes it increasingly difficult to manage, access and utilize the data and ultimately to gain scientific insights based on it. At the same time, domain scientists want to focus on their science instead of on IT. The solution is data management in order to store data in a structured way and enable easy discovery for future reference. An integral part of it is the use of metadata. With it, data becomes accessible by its content instead of only by name and location...

10.7287/peerj.preprints.2831 preprint EN 2017-02-24

The Helmholtz Association (Anonymous 2022d), the largest association of large-scale research centres in Germany, covers a wide range of research fields employing more than 43,000 researchers. In 2019, the Helmholtz Metadata Collaboration (HMC) (Anonymous 2022f) Platform was started as a joint endeavor across all research areas to make the depth and breadth of data produced by the Helmholtz Centres findable, accessible, interoperable and reusable (FAIR) for the whole science community. To reach this goal, the concept of FAIR Digital Objects (FAIR DOs) has been chosen...

10.3897/rio.8.e94758 article EN cc-by Research Ideas and Outcomes 2022-10-12
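
FAIR DOs hinge on PIDs that resolve to typed attribute records. As a small illustration, a Handle record can be inspected through the public Handle.net REST API; the handle used below is a placeholder.

```python
import json
from urllib.request import urlopen

def resolve(handle: str) -> dict:
    """Fetch the typed attribute records behind a PID via the Handle REST API."""
    with urlopen(f"https://hdl.handle.net/api/handles/{handle}") as resp:
        record = json.load(resp)
    # Each value carries a type (URL, checksum, profile, ...) and a payload;
    # these typed attributes are what make the object machine-actionable.
    return {v["type"]: v["data"]["value"] for v in record["values"]}

# Placeholder PID; substitute a resolvable handle to try it out.
# print(resolve("21.11102/example-fair-do"))
```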

Grid is a rapidly growing new technology that will provide easy access to vast amounts of computer resources, both hardware and software. As these resources become available, more and more scientific users are interested in benefiting from them. At this time the main problem in accessing the grid is that users usually need to know a lot about grid methods and technologies besides their own field of application. This paper describes a toolkit based on Grid Services, designed especially for the process of data processing, providing database...

10.1109/e-science.2006.261046 article EN 2006-12-01

Parallel and high performance computing experts are obsessed with scalability. Performance analysis and tuning are important and complex, but there are a number of software tools to support this. One established methodology is the detailed recording of the parallel runtime behavior in event traces and their subsequent analysis. This regularly produces very large data sets which pose their own challenges for handling and management. The paper evaluates the utilization of the MASi research data management service as a trace repository to store, manage and find traces in an efficient and usable...

10.1016/j.procs.2017.05.190 article EN Procedia Computer Science 2017-01-01