Transformer-based Reasoning for Learning Evolutionary Chain of Events on Temporal Knowledge Graph

DOI: 10.1145/3626772.3657706 Publication Date: 2024-07-11T16:40:05Z
ABSTRACT
Temporal Knowledge Graph (TKG) reasoning often involves completing missing factual elements along the timeline. Although existing methods can learn good embeddings for each element in quadruples by integrating temporal information, they fail to infer the evolution of temporal facts. This is mainly because of (1) insufficiently exploring the internal structure and semantic relationships within individual quadruples and (2) inadequately learning a unified representation of the contextual correlations among different quadruples. To overcome these limitations, we propose a novel Transformer-based reasoning model (dubbed ECEformer) for TKG to learn the Evolutionary Chain of Events (ECE). Specifically, we unfold the neighborhood subgraph of an entity node in chronological order, forming an evolutionary chain of events as the input to our model. Subsequently, we utilize a Transformer encoder to learn the intra-quadruple representations of the ECE. We then craft a mixed-context reasoning module based on a multi-layer perceptron (MLP) to learn unified inter-quadruple representations of the ECE while accomplishing knowledge reasoning. In addition, to enhance the timeliness of the events, we devise an additional time prediction task to complete the effective temporal information within the learned representation. Extensive experiments on six benchmark datasets verify the state-of-the-art performance and the effectiveness of our method.
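To make the first step of the pipeline concrete, the following is a minimal sketch (not the authors' implementation) of unfolding an entity's neighborhood subgraph into a chronologically ordered evolutionary chain of events. It assumes TKG facts are stored as (subject, relation, object, timestamp) quadruples; the function name and the toy facts are hypothetical.

```python
def build_evolutionary_chain(quadruples, entity):
    """Collect every quadruple involving `entity` (as subject or object)
    and sort the result by timestamp, yielding the evolutionary chain
    of events that would be fed to the Transformer encoder."""
    neighborhood = [q for q in quadruples if q[0] == entity or q[2] == entity]
    return sorted(neighborhood, key=lambda q: q[3])

# Toy TKG: (subject, relation, object, timestamp)
facts = [
    ("A", "meets", "B", 3),
    ("A", "visits", "C", 1),
    ("D", "calls", "A", 2),
    ("B", "meets", "C", 4),
]

chain = build_evolutionary_chain(facts, "A")
# chain: [("A", "visits", "C", 1), ("D", "calls", "A", 2), ("A", "meets", "B", 3)]
```

Each quadruple in the resulting chain would then be encoded individually (intra-quadruple), with the MLP-based mixed-context module aggregating across the chain (inter-quadruple).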