Matthias Volk

ORCID: 0000-0002-4835-919X
Research Areas
  • Big Data and Business Intelligence
  • Software System Performance and Reliability
  • Data Quality and Management
  • Big Data Technologies and Applications
  • Cloud Computing and Resource Management
  • Digital Transformation in Industry
  • Scientific Computing and Data Management
  • Distributed and Parallel Computing Systems
  • Business Process Modeling and Analysis
  • Consumer Retail Behavior Studies
  • Semantic Web and Ontologies
  • Smart Grid Security and Resilience
  • Advanced Software Engineering Methodologies
  • Software Reliability and Analysis Research
  • Software Engineering Research
  • Economic and Technological Systems Analysis
  • RFID technology advancements
  • Digital Innovation in Industries
  • Software Engineering Techniques and Practices
  • IoT and Edge/Fog Computing
  • Scheduling and Optimization Algorithms
  • Particle physics theoretical and experimental studies
  • FinTech, Crowdfunding, Digital Finance
  • Digital Platforms and Economics
  • Software Testing and Debugging Techniques

Otto-von-Guericke University Magdeburg
2016-2023

University of Twente
2022

Nowadays, additive manufacturing (AM) is a rapidly growing and emerging discipline in manufacturing, as is AI in informational applications. Both are related to self-referential and self-copying concepts, which make them scalable, and both spread osmotically into mass production through AI-related Cyber-Physical Systems (CPS) and the computational approach. Whether AI-AM, which has framed itself as a field in a self-propagating way, can be logically or systematically unified remains open. The paper firstly investigates recent developments in the field of process flow and how they result...

10.1016/j.procs.2021.04.161 article EN Procedia Computer Science 2021-01-01

Feynman diagrams constitute one of the essential ingredients for making precision predictions for collider experiments. Yet, while the simplest diagrams can be evaluated in terms of multiple polylogarithms -- whose properties as special functions are well understood -- more complex diagrams often involve integrals over complicated algebraic manifolds. Such diagrams already contribute at NNLO to the self-energy of the electron, to $t \bar{t}$ production, to $\gamma \gamma$ production and to Higgs decay, and appear at two loops in the planar limit of maximally supersymmetric...

10.48550/arxiv.2203.07088 preprint EN other-oa arXiv (Cornell University) 2022-01-01
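For background, the multiple polylogarithms mentioned in this abstract are commonly defined through an iterated-integral recursion (the standard textbook definition, not a formula taken from the preprint itself):

```latex
G(a_1,\dots,a_n;x) \;=\; \int_0^x \frac{\mathrm{d}t}{t-a_1}\, G(a_2,\dots,a_n;t),
\qquad G(;x) \;=\; 1,
```

with the classical polylogarithm recovered as the special case $\mathrm{Li}_n(x) = -G(0,\dots,0,1;x)$, where $n-1$ zeros precede the final argument $1$.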

The assessment of the exploration potential and high upfront costs are significant barriers to the deployment of geothermal energy exploitation. The DEGREE project aims to enhance the success rate of such projects through advancing methodologies and the development of a virtual digital underground laboratory. This laboratory leverages a digital twin of the target area, with a test site in the East Eifel region of Germany. It covers the entire workflow, from data collection and processing to geological and coupled hydro-thermo-mechanical modeling, as well as visualisation...

10.5194/egusphere-egu25-15245 preprint EN 2025-03-15

In this research article, the buzzwords of digital thread, digital twin, and Industry 4.0 are examined by means of a systematic literature review. The key concepts shaping these paradigms are investigated to achieve an overview of existing solutions. First, the body of literature is explored to provide general observations on similarities and differences between the concepts. Subsequently, the technologies are provided that are necessary for the vision of the digital thread. Based on the identified technologies, state-of-the-art solutions relating to the digital thread are discussed. Finally, the work...

10.1016/j.procs.2022.12.387 article EN Procedia Computer Science 2023-01-01

This contribution examines the terms of big data and big data engineering, considering their specific characteristics and challenges. Deduced from those, it concludes the need for new ways to support the creation of corresponding systems and to help big data in reaching its full potential. In the following, the state of the art is analysed and, for the subdomains of engineering, solutions are presented. In the end, a possible concept for filling the identified gap is proposed and future perspectives are highlighted.

10.5220/0007748803510358 article EN cc-by-nc-nd 2019-01-01

Big data has evolved to a ubiquitous part of today's society. However, despite its popularity, the development and testing of corresponding applications are still very challenging tasks that are being actively researched in pursuit of ways for improvement. One newly introduced proposition is the application of test driven development (TDD) in the big data domain. To facilitate this concept, existing literature reviews on TDD have been analyzed to extract insights from those sources and to aggregate knowledge, which can be applied to new...

10.1109/icit52682.2021.9491728 article EN 2021-07-14
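To make the TDD idea concrete, below is a minimal sketch of the test-first workflow applied to a big-data-style cleaning step; the function and field names are hypothetical illustrations, not taken from the paper:

```python
# TDD sketch: the test is written first, then the minimal
# implementation that makes it pass. All names are hypothetical.

def drop_invalid_records(records):
    """Keep only records that have a non-empty 'user_id' field."""
    return [r for r in records if r.get("user_id")]

def test_drop_invalid_records():
    raw = [
        {"user_id": "u1", "value": 10},
        {"user_id": "",   "value": 20},   # invalid: empty id
        {"value": 30},                    # invalid: missing id
    ]
    assert drop_invalid_records(raw) == [{"user_id": "u1", "value": 10}]

test_drop_invalid_records()
```

In a TDD cycle this test would initially fail ("red"), the implementation would be added ("green"), and the code then refactored while keeping the test passing.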

Big data attracts researchers and practitioners around the globe in their desire to effectively manage the data deluge resulting from the ongoing evolution of the information systems domain. Consequently, many decision makers attempt to harness the potentials arising with the use of those modern technologies in a multitude of application scenarios. As a result, big data has gained an important role for businesses. However, as of today, developed solutions are oftentimes perceived as completed products, without considering that highly...

10.7250/csimq.2020-23.05 article EN cc-by Complex Systems Informatics and Modeling Quarterly 2020-07-31

The retailing industry witnessed a significant shift in the past years, which introduced modifications to standard procedures and regular practices of supply chains (SC). These became necessary due to worldwide distress that caused major disruptions and instabilities in SC. Organizations started developing digital-transformation strategies, which include the integration and analysis of external data sources to detect SC disruptions and retain a competitive advantage in the market. Developing such strategies requires exploring current technologies and investigating...

10.1016/j.procs.2022.12.386 article EN Procedia Computer Science 2023-01-01

The analysis of formal models that include quantitative aspects such as timing or probabilistic choices is performed by quantitative verification tools. Broad and mature tool support is available for computing basic properties such as expected rewards on Markov chains. Previous editions of QComp, the comparison of tools for the analysis of such quantitative formal models, focused on this setting. Many application scenarios, however, require more advanced property types such as LTL and parameter synthesis queries, as well as models like stochastic games and partially observable MDPs. For these, tool support is in...

10.48550/arxiv.2405.13583 preprint EN arXiv (Cornell University) 2024-05-22
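As a minimal illustration of the basic setting the abstract refers to (expected rewards on Markov chains), the sketch below solves the Bellman fixed-point equations by value iteration; it is an illustrative toy, not code from any QComp tool:

```python
# Expected reward accumulated before reaching a goal state in a
# discrete-time Markov chain, via fixed-point iteration of
#   E[s] = r(s) + sum_t P(s,t) * E[t],   with E[goal] = 0.

def expected_reward(P, r, goal, iters=200):
    n = len(P)
    E = [0.0] * n
    for _ in range(iters):
        E = [0.0 if s == goal else
             r[s] + sum(P[s][t] * E[t] for t in range(n))
             for s in range(n)]
    return E

# Three-state chain: state 0 stays or moves to 1; state 2 is the goal.
P = [[0.5, 0.5, 0.0],
     [0.0, 0.0, 1.0],
     [0.0, 0.0, 1.0]]
r = [1.0, 1.0, 0.0]   # reward 1 per step before absorption
print(expected_reward(P, r, goal=2))   # → [3.0, 1.0, 0.0]
```

The fixed point matches the analytic solution: E[1] = 1 and E[0] = 1 + 0.5·E[0] + 0.5·E[1], giving E[0] = 3.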

The success of big data projects depends heavily on the ability to decide which technological expertise and architectures are required in such a context. However, there is confusion in the scientific discourse regarding the systemization of technologies, which exacerbates this problem. Therefore, in this paper, a structured literature review is conducted to assess the current state of the art and to give an overview about technology classifications. It can be stated that only very limited approaches for such classifications exist, partially...

10.1109/cbi.2017.26 article EN 2017-07-01

Big data is considered as one of the most promising technological advancements of the last decades. Today it is used for a multitude of data-intensive projects in various domains and also serves as the technical foundation for other recent trends in the computer science domain. However, the complexity of its implementation and utilization renders adoption a sophisticated endeavor. For this reason, it is not surprising that potential users are often overwhelmed and tend to rely on existing guidelines and best practices to successfully realize and monitor their...

10.1109/access.2020.3028127 article EN cc-by IEEE Access 2020-01-01

Today, the amount and complexity of data that is globally produced increases continuously, surpassing the abilities of traditional approaches. Therefore, to capture and analyze those data, new concepts and techniques are utilized to engineer powerful big data systems. However, despite the existence of sophisticated approaches for engineering such systems, testing them is not sufficiently researched. Hence, in this contribution, a comparison of common software testing procedures and the requirements of big data systems is drawn. The determined specificities of the domain...

10.1109/sitis.2019.00055 article EN 2019-11-01

Big Data is a term that gained popularity due to its potential benefits in various fields, and it is progressively being used. However, there are still many gaps and challenges to overcome, especially when it comes to the selection and handling of relevant technologies. As a consequence of the huge number of manifestations in this area, growing each year, uncertainty and complexity increase. The lack of a classification approach causes a demand for more experts with broad knowledge and expertise. Using techniques of ontology engineering...

10.4018/ijiit.2020040103 article EN International Journal of Intelligent Information Technologies 2020-02-28

Microservices and Big Data are renowned hot topics in computer science that have gained a lot of hype. While the use of microservices is an approach in modern software development to increase flexibility, big data allows organizations to turn today’s information deluge into valuable insights. Many of those big data architectures comprise rather monolithic elements. However, a new trend arises in which these are replaced with more modularized ones, such as microservices. This transformation provides benefits from modularity, evolutionary...

10.52825/bis.v1i.67 article EN cc-by Business Information Systems 2021-07-02

The number of online purchases is increasing constantly. Companies have recognized the related opportunities and they are using online channels progressively. In order to acquire potential customers, companies often try to gain a better understanding of them through the use of web analytics. One of the most useful sources is log files. Basically, these provide an abundance of important information about user behavior on a website, such as the navigation path or access time. Mining this so-called clickstream data in a comprehensive way has become...

10.1109/bigdatacongress.2017.60 article EN 2017-06-01
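As an illustration of the kind of clickstream mining described above, the sketch below parses web-server access-log lines and reconstructs each visitor's navigation path; the log format and regular expression are assumptions for the example, not artifacts of the paper:

```python
import re
from collections import defaultdict

# Hypothetical sketch: extract client IP, timestamp, and requested URL
# from Common-Log-Format-style lines, then group URLs per visitor.
LOG_RE = re.compile(r'(\S+) \S+ \S+ \[([^\]]+)\] "GET (\S+) HTTP/[\d.]+"')

def navigation_paths(lines):
    paths = defaultdict(list)
    for line in lines:
        m = LOG_RE.match(line)
        if m:
            ip, _timestamp, url = m.groups()
            paths[ip].append(url)   # lines assumed already time-ordered
    return dict(paths)

logs = [
    '1.2.3.4 - - [01/Jun/2017:10:00:01 +0000] "GET /home HTTP/1.1" 200 512',
    '1.2.3.4 - - [01/Jun/2017:10:00:09 +0000] "GET /shop HTTP/1.1" 200 813',
    '5.6.7.8 - - [01/Jun/2017:10:00:11 +0000] "GET /home HTTP/1.1" 200 512',
]
print(navigation_paths(logs))
# → {'1.2.3.4': ['/home', '/shop'], '5.6.7.8': ['/home']}
```

A real pipeline would additionally sessionize by timestamp gaps and handle other HTTP methods, but the grouping step above is the core of path reconstruction.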