RotatE: Knowledge Graph Embedding by Relational Rotation in Complex Space
FOS: Computer and information sciences
Subjects: Machine Learning (cs.LG); Computation and Language (cs.CL); Machine Learning (stat.ML); 02 engineering and technology; 0202 electrical engineering, electronic engineering, information engineering
DOI: 10.48550/arXiv.1902.10197
Publication Date: 2019-01-01
AUTHORS (4): Zhiqing Sun, Zhi-Hong Deng, Jian-Yun Nie, Jian Tang
ABSTRACT
We study the problem of learning representations of entities and relations in knowledge graphs for predicting missing links. The success of such a task relies heavily on the ability to model and infer the patterns of (or between) the relations. In this paper, we present a new approach for knowledge graph embedding called RotatE, which is able to model and infer various relation patterns including symmetry/antisymmetry, inversion, and composition. Specifically, the RotatE model defines each relation as a rotation from the source entity to the target entity in the complex vector space. In addition, we propose a novel self-adversarial negative sampling technique for efficiently and effectively training the RotatE model. Experimental results on multiple benchmark knowledge graphs show that the proposed RotatE model is not only scalable, but also able to infer and model various relation patterns and to significantly outperform existing state-of-the-art models for link prediction.
Accepted to ICLR 2019.
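The abstract's two core ideas, relations as element-wise rotations in complex space and self-adversarial weighting of negative samples, can be sketched briefly. The snippet below is an illustrative NumPy sketch, not the authors' released code: it assumes the common formulation in which each relation coordinate is a unit-modulus complex number exp(i * phase), the triple score is the negative distance between the rotated head and the tail, and "self-adversarial" is read as weighting negatives by a softmax of their current scores with a temperature alpha. Function and variable names are hypothetical.

import numpy as np

def rotate_score(head, rel_phase, tail):
    # head, tail : complex entity embeddings, shape (k,)
    # rel_phase  : relation phases in [-pi, pi], shape (k,); the relation is the
    #              unit-modulus complex vector exp(i * phase), so multiplying
    #              `head` by it rotates every coordinate in the complex plane.
    # Returns the negative distance ||head * r - tail||; larger = more plausible.
    r = np.exp(1j * rel_phase)                 # |r_i| = 1 for every coordinate
    return -np.linalg.norm(head * r - tail)

def self_adversarial_weights(neg_scores, alpha=1.0):
    # Weight negative samples by a softmax of their current scores so that
    # higher-scoring (harder) negatives contribute more to the loss;
    # `alpha` is the sampling temperature (an assumed hyperparameter name).
    z = alpha * np.asarray(neg_scores)
    z = z - z.max()                            # numerical stability
    w = np.exp(z)
    return w / w.sum()

# Toy usage with random embeddings (k = 4 complex dimensions).
rng = np.random.default_rng(0)
k = 4
h = rng.normal(size=k) + 1j * rng.normal(size=k)
t = rng.normal(size=k) + 1j * rng.normal(size=k)
phase = rng.uniform(-np.pi, np.pi, size=k)
print(rotate_score(h, phase, t))
negatives = [rng.normal(size=k) + 1j * rng.normal(size=k) for _ in range(3)]
print(self_adversarial_weights([rotate_score(h, phase, n) for n in negatives]))

Because each relation coordinate has modulus one, composing two relations multiplies their rotations (phases add), and a relation's inverse is its conjugate rotation, which is how this parameterization can capture the composition, inversion, and symmetry/antisymmetry patterns named in the abstract.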