Knowledge graph embedding closed under composition
DOI: 10.1007/s10618-024-01050-x
Publication Date: 2024-07-04
ABSTRACT
Knowledge Graph Embedding (KGE) has attracted increasing attention. Relation patterns, such as symmetry and inversion, have received considerable focus. Among them, composition patterns are particularly important, as they involve nearly all relations in KGs. However, prior KGE approaches often consider relations to be compositional only if they are well represented in the training data. Consequently, this can lead to performance degradation, especially for under-represented composition patterns. To this end, we propose HolmE, a general form of KGE whose relation embedding space is closed under composition, meaning that the composition of any two given relation embeddings remains within the embedding space. This property ensures that every relation embedding can compose, or be composed by, other relation embeddings. It enhances HolmE's capability to model composition patterns with limited learning instances (also called long-tail composition patterns). To the best of our knowledge, this work is pioneering in discussing KGE spaces being closed under composition. We provide detailed theoretical proof and extensive experiments to demonstrate HolmE's notable advantages in modelling long-tail composition patterns. Our results also highlight HolmE's effectiveness in extrapolating to unseen relations through composition, achieving state-of-the-art performance on benchmark datasets.
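The abstract's central property, closure under composition, can be illustrated with a toy example. The sketch below is not HolmE's actual formulation (the paper defines that); it uses unit complex numbers, i.e. 2D rotations, as the relation space, in the style of rotation-based KGE models. The relation names and helper functions are hypothetical, chosen only to make the closure check concrete.

import cmath

# Toy illustration of "closed under composition" -- NOT HolmE's actual
# formulation. Relations are embedded as unit complex numbers (2D
# rotations): the product of two rotations is again a rotation, so every
# composite relation still has a valid embedding in the same space.

def embed_relation(theta: float) -> complex:
    """Embed a relation as a point on the unit circle (a rotation)."""
    return cmath.exp(1j * theta)

def compose(r1: complex, r2: complex) -> complex:
    """Compose two relation embeddings (apply r1, then r2)."""
    return r1 * r2

def in_space(r: complex, tol: float = 1e-9) -> bool:
    """Membership test for the relation space: unit modulus."""
    return abs(abs(r) - 1.0) < tol

if __name__ == "__main__":
    # Hypothetical relations, e.g. bornIn and cityInCountry.
    born_in = embed_relation(0.7)
    city_in_country = embed_relation(1.9)

    # Their composition (e.g. bornInCountry) stays inside the space,
    # even if it never appeared as a relation in the training data.
    composite = compose(born_in, city_in_country)
    assert in_space(composite)
    print("composite:", composite, "modulus:", abs(composite))

The same check would fail for a space that is not closed: if relation embeddings were restricted to, say, an arc of the circle, composing two of them could leave the space, and the composite relation would have no valid embedding.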