Jacopo Urbani

ORCID: 0000-0002-0717-3559
Research Areas
  • Semantic Web and Ontologies
  • Advanced Database Systems and Queries
  • Logic, Reasoning, and Knowledge
  • Data Quality and Management
  • Advanced Graph Neural Networks
  • Topic Modeling
  • Data Management and Algorithms
  • Natural Language Processing Techniques
  • Graph Theory and Algorithms
  • Scientific Computing and Data Management
  • Web Data Mining and Analysis
  • AI-based Problem Solving and Planning
  • Service-Oriented Architecture and Web Services
  • Distributed and Parallel Computing Systems
  • Biomedical Text Mining and Ontologies
  • Cryptography and Data Security
  • Advanced Data Storage Technologies
  • Cloud Computing and Resource Management
  • Misinformation and Its Impacts
  • Speech Recognition and Synthesis
  • Advanced Text Analysis Techniques
  • Anomaly Detection Techniques and Applications
  • Rough Sets and Fuzzy Logic
  • Multi-Agent Systems and Negotiation
  • Genomics and Phylogenetic Studies

Vrije Universiteit Amsterdam
2015-2024

Max Planck Institute for Informatics
2015-2016

Amsterdam UMC Location Vrije Universiteit Amsterdam
2016

IBM Research - Ireland
2012

Fact-checking is a crucial task for accurately populating, updating, and curating knowledge graphs. Manually validating candidate facts is time-consuming. Prior work on automating this task focuses on estimating truthfulness using numerical scores, which are not human-interpretable. Others extract explicit mentions of the fact in text as evidence, but such mentions can be hard to spot directly. In our work, we introduce ExFaKT, a framework focused on generating human-comprehensible explanations for candidate facts. ExFaKT uses...

10.1145/3289600.3290996 article EN 2019-01-30
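
The abstract above describes rewriting a candidate fact with background rules until it grounds in evidence. Below is a minimal sketch of that idea; the rules, facts, and predicate names are made up for illustration and are not taken from the paper, and real ExFaKT rules carry variables rather than a fixed subject/object pair:

```python
# Horn rules rewrite a candidate fact into subqueries until every leaf is
# supported by the knowledge graph; the recursion trace doubles as a
# human-readable explanation. Illustrative data only.

RULES = {
    # head predicate -> list of rule bodies (each a conjunction of predicates)
    "spouse": [["marriedIn", "livesWith"]],
}

KG = {("marriedIn", "alice", "bob"), ("livesWith", "alice", "bob")}

def explain(pred, s, o):
    """Return a human-readable explanation for fact (s, pred, o), or None."""
    if (pred, s, o) in KG:
        return f"'{s} {pred} {o}' is stated in the KG"
    for body in RULES.get(pred, []):
        parts = [explain(b, s, o) for b in body]
        if all(parts):
            return f"'{s} {pred} {o}' holds because " + " AND ".join(parts)
    return None  # no rule-based explanation found

print(explain("spouse", "alice", "bob"))
```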

The evaluation of Datalog rules over large Knowledge Graphs (KGs) is essential for many applications. In this paper, we present a new method for materializing inferences, which combines a column-based memory layout with novel optimization methods that avoid redundant inferences at runtime. Pro-active caching of certain subqueries further increases efficiency. Our empirical evaluation shows that our approach can often match or even surpass the performance of state-of-the-art systems, especially under restricted resources.

10.1609/aaai.v30i1.9993 article EN Proceedings of the AAAI Conference on Artificial Intelligence 2016-02-21
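
As context for the abstract above, here is a minimal semi-naive Datalog materialization loop: joining only against tuples derived in the previous round is the classic way to avoid redundant re-derivations. The rule and data are illustrative, and the paper's column-based layout and subquery caching are not reproduced here:

```python
# Semi-naive materialization of a transitive-closure rule:
#   reach(X, Z) <- reach(X, Y), edge(Y, Z), with reach seeded from edge.

edges = {("a", "b"), ("b", "c"), ("c", "d")}

reach = set(edges)
delta = set(edges)  # tuples derived in the previous round

while delta:
    # Join only the freshly derived tuples against the base relation.
    new = {(x, z) for (x, y) in delta for (y2, z) in edges if y == y2}
    delta = new - reach  # keep only genuinely new derivations
    reach |= delta

print(sorted(reach))
```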

The Semantic Web contains many billions of statements, which are released using the resource description framework (RDF) data model. To better handle these large amounts of data, high-performance RDF applications must apply a compression technique. Unfortunately, because of the input size, even this is challenging. In this paper, we propose a set of distributed MapReduce algorithms to efficiently compress and decompress a large amount of data. Our approach uses a dictionary encoding technique that maintains...

10.1002/cpe.2840 article EN Concurrency and Computation Practice and Experience 2012-04-23
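
The core idea of dictionary encoding can be illustrated in a few lines. The sketch below is single-process and illustrative; the paper's contribution is performing this at scale with distributed MapReduce jobs, which this sketch does not attempt:

```python
# Toy dictionary encoding for RDF terms: each distinct URI or literal is
# mapped to a compact integer ID, and triples are stored as ID tuples.

triples = [
    ("<http://example.org/alice>", "<http://xmlns.com/foaf/0.1/knows>",
     "<http://example.org/bob>"),
    ("<http://example.org/bob>", "<http://xmlns.com/foaf/0.1/knows>",
     "<http://example.org/alice>"),
]

term_to_id, id_to_term = {}, []

def encode(term):
    """Assign a fresh ID on first sight, reuse it afterwards."""
    if term not in term_to_id:
        term_to_id[term] = len(id_to_term)
        id_to_term.append(term)
    return term_to_id[term]

encoded = [tuple(encode(t) for t in triple) for triple in triples]
decoded = [tuple(id_to_term[i] for i in triple) for triple in encoded]

print(encoded)            # e.g., [(0, 1, 2), (2, 1, 0)]
assert decoded == triples  # decompression recovers the original terms
```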

The Semantic Web consists of many billions of statements made of terms that are either URIs or literals. Since these terms usually consist of long sequences of characters, an effective compression technique must be used to reduce the data size and increase application performance. One of the best-known techniques for compression is dictionary encoding. In this paper, we propose a MapReduce algorithm that efficiently compresses and decompresses a large amount of data. We have implemented a prototype using the Hadoop framework and report an evaluation that shows...

10.1145/1851476.1851591 article EN 2010-06-21

Currently, MapReduce is the most popular programming model for large-scale data processing, and this has motivated the research community to improve its efficiency with new extensions, algorithmic optimizations, or hardware. In this paper, we address two main limitations of MapReduce: one relates to the model's limited expressiveness, which prevents the implementation of complex programs that require multiple steps or iterations. The other is that implementations (e.g., Hadoop) provide good resource utilization only...

10.1109/icdcs.2014.62 article EN 2014-06-01
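
To make the expressiveness limitation concrete, the toy map/reduce below runs an iterative label-propagation computation: each round must be submitted as a separate "job" that regroups all the data, which is exactly what multi-step algorithms force onto the plain MapReduce model. The abstraction, names, and data here are illustrative only:

```python
from itertools import groupby
from operator import itemgetter

def run_job(records, mapper, reducer):
    """One in-memory MapReduce 'job': map, shuffle (group by key), reduce."""
    mapped = [kv for rec in records for kv in mapper(rec)]
    mapped.sort(key=itemgetter(0))
    return [out
            for key, group in groupby(mapped, key=itemgetter(0))
            for out in reducer(key, [v for _, v in group])]

# Iteratively propagate the minimum label over edges (a toy connected-
# components step); every iteration is a whole new job submission.
labels = {"a": "a", "b": "b", "c": "c"}
edges = [("a", "b"), ("b", "c")]

for _ in range(3):  # fixed number of rounds for brevity
    mapper = lambda e: [(e[0], labels[e[1]]), (e[1], labels[e[0]])]
    reducer = lambda node, ls: [(node, min(ls + [labels[node]]))]
    for node, lab in run_job(edges, mapper, reducer):
        labels[node] = lab

print(labels)  # all nodes converge to label 'a'
```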

In the last few years, a new research area, called stream reasoning, has emerged to bridge the gap between reasoning and stream processing. While current reasoning approaches are designed to work mainly on static data, the Web is, on the other hand, extremely dynamic: information is frequently changed and updated, and data is continuously generated from a huge number of sources, often at a high rate. In other words, fresh information is constantly made available in the form of streams of updates. Despite some promising investigations, the area is still in its infancy, both from the perspective of models...

10.2139/ssrn.3199091 article EN SSRN Electronic Journal 2014-01-01

Commonsense knowledge about part-whole relations (e.g., screen partOf notebook) is important for interpreting user input in web search and question answering, or for object detection in images. Prior work on knowledge base construction has compiled part-whole assertions, but with substantial limitations: i) semantically different kinds of part-whole relations are conflated into a single generic relation, ii) the arguments of an assertion are merely words with ambiguous meaning, and iii) assertions lack additional attributes like visibility (e.g., a nose is visible...)

10.1609/aaai.v30i1.9992 article EN Proceedings of the AAAI Conference on Artificial Intelligence 2016-02-21
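
Below is a small sketch of what an assertion addressing those three limitations could look like: a typed relation, sense-disambiguated arguments, and an explicit visibility attribute. The field names, relation labels, and sense identifiers are illustrative assumptions, not the paper's actual schema:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class PartWholeAssertion:
    part_sense: str    # disambiguated sense, e.g. a WordNet-style ID
    whole_sense: str
    relation: str      # typed relation instead of one generic partOf
    visible: bool      # can the part be seen on the whole?
    confidence: float

a = PartWholeAssertion(
    part_sense="screen.n.01",
    whole_sense="notebook.n.02",  # the computer, not the paper pad
    relation="physicalPartOf",
    visible=True,
    confidence=0.9,
)
print(a)
```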

As more and more data is being generated by sensor networks, social media, and organizations, the Web interlinking this wealth of information becomes more complex. This is particularly true for the so-called Linked Data, in which data is semantically enriched and interlinked using ontologies. In this large and uncoordinated environment, reasoning can be used to check the consistency of the data and of the associated ontologies, or to infer logical consequences which, in turn, can be used to obtain new insights from the data. However, reasoning approaches need to be scalable in order to enable reasoning over the entire...

10.1017/s0269888918000255 article EN The Knowledge Engineering Review 2018-01-01

In order to accurately populate and curate Knowledge Graphs (KGs), it is important to distinguish ⟨s, p, o⟩ facts that can be traced back to sources from those that cannot be verified. Manually validating each fact is time-consuming. Prior work on automating this task relied on numerical confidence scores, which might not be easily interpreted. To overcome this limitation, we present Tracy, a novel tool that generates human-comprehensible explanations for candidate facts. Our tool relies on background knowledge in the form of rules to rewrite...

10.1145/3308558.3314126 article EN 2019-05-13