Correlation-Attention Masked Temporal Transformer for User Identity Linkage Using Heterogeneous Mobility Data

DOI: 10.1609/aaai.v39i12.33418 Publication Date: 2025-04-11T12:07:29Z
ABSTRACT
With the rise of social media and Location-Based Social Networks (LBSN), check-in data across platforms has become crucial for User Identity Linkage (UIL). These data not only reveal users' spatio-temporal information but also provide insights into their behavior patterns and interests. However, cross-platform identity linkage faces challenges such as poor data quality, high sparsity, and noise interference, which hinder existing methods from extracting user information. To address these issues, we propose a Correlation-Attention Masked Transformer Linkage Network (MT-Link), a transformer-based framework that enhances model performance by learning co-occurrence patterns across users' check-in sequences. Our model employs a correlation attention mechanism to detect spatio-temporal co-occurrence between check-in sequences. Guided by attention weight maps, it focuses on co-occurrence points while filtering out noise, ultimately improving classification performance. Experimental results show that our model significantly outperforms state-of-the-art baselines, with improvements of 12.92%-17.76% and 5.80%-8.38% in terms of Macro-F1 and Area Under the Curve (AUC), respectively.
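The abstract describes a correlation attention mechanism whose weight maps guide a mask that keeps co-occurrence points and filters out noise. The sketch below illustrates that general idea only; it is not the paper's implementation, and the function name, the top-k-style thresholding, and the toy embeddings are all assumptions for illustration.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def masked_correlation_attention(q_seq, k_seq, keep_ratio=0.5):
    """Hypothetical sketch: correlate two check-in embedding sequences,
    then mask out low-weight (noisy) positions before re-normalizing."""
    # scaled dot-product correlation scores between the two sequences
    scores = q_seq @ k_seq.T / np.sqrt(q_seq.shape[-1])
    weights = softmax(scores, axis=-1)
    # per-row threshold: keep only the top-scoring fraction of positions
    thresh = np.quantile(weights, 1.0 - keep_ratio, axis=-1)[:, None]
    mask = weights >= thresh
    # masked positions get a large negative score, so softmax zeroes them
    masked_scores = np.where(mask, scores, -1e9)
    return softmax(masked_scores, axis=-1)

# toy example: two check-in sequences of length 4 with 8-dim embeddings
rng = np.random.default_rng(0)
q = rng.normal(size=(4, 8))
k = rng.normal(size=(4, 8))
attn = masked_correlation_attention(q, k, keep_ratio=0.5)
print(attn.shape)                            # (4, 4)
print(np.allclose(attn.sum(axis=-1), 1.0))   # True: rows renormalized
```

The masking step is the key design choice the abstract highlights: by suppressing low-correlation positions before the final normalization, attention concentrates on the retained co-occurrence points rather than spreading mass over noisy check-ins.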