Negative Feedback Matters: Exploring Positive and Negative Correlations for Time Series Anomaly Detection

DOI: 10.3390/electronics14102068 Publication Date: 2025-05-20T12:41:12Z
ABSTRACT
Recently, graph neural networks (GNNs) have demonstrated remarkable success in multivariate time series anomaly detection, particularly by explicitly modeling inter-variable relationships. However, to prevent the distinct pattern of one variable from introducing noise into unrelated variables, existing methods rely solely on positive correlations among neighbors for relationship modeling and neglect the role of negative correlations. This limitation hinders their effectiveness in complex scenarios where both positive and negative dependencies are critical. To address this challenge, we propose PNGDN, a novel GNN framework that incorporates both positive and negative correlations to enhance anomaly-detection performance. Notably, PNGDN introduces a correlational graph structure learning module that simultaneously captures positive and negative dependencies. It filters out spurious relationships using the magnitude of similarity as a unified threshold for screening both positive and negative correlations, allowing the model to focus on truly meaningful inter-variable dependencies. Additionally, an attention-based information propagation mechanism ensures that information propagates efficiently along both positive and negative correlations, facilitating accurate predictions for each variable. Extensive experiments on three benchmark time series anomaly detection datasets validate the superior performance of PNGDN.
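
The following is a minimal illustrative sketch, not the authors' implementation: it shows one way a signed graph could be built by thresholding the magnitude of pairwise similarity (keeping the sign to mark positive vs. negative correlation) and one sign-aware attention propagation step. The cosine-similarity choice, the threshold value tau, and all function and parameter names (build_signed_graph, sign_aware_propagation, node_emb, w_q, w_k) are assumptions for illustration; the abstract only states that a single unified threshold screens both positive and negative correlations.

# Illustrative sketch only; names and design choices are assumptions, not PNGDN's code.
import torch
import torch.nn.functional as F

def build_signed_graph(node_emb: torch.Tensor, tau: float = 0.3) -> torch.Tensor:
    """Return a signed adjacency matrix with entries in {-1, 0, +1}.

    node_emb: (N, d) embedding per variable. Edges whose |cosine similarity|
    falls below the unified threshold tau are treated as spurious and dropped;
    the sign of a surviving similarity marks the correlation as positive or negative.
    """
    sim = F.cosine_similarity(node_emb.unsqueeze(1), node_emb.unsqueeze(0), dim=-1)
    sim.fill_diagonal_(0.0)            # no self-loops
    mask = sim.abs() >= tau            # one threshold screens both signs
    return torch.sign(sim) * mask

def sign_aware_propagation(x: torch.Tensor, adj_signed: torch.Tensor,
                           w_q: torch.Tensor, w_k: torch.Tensor) -> torch.Tensor:
    """One attention-weighted propagation step over the signed graph.

    x: (N, d) node features; w_q, w_k: (d, d) projections. Attention is computed
    only over retained edges; messages from negatively correlated neighbors are
    subtracted rather than added.
    """
    scores = (x @ w_q) @ (x @ w_k).T / x.shape[-1] ** 0.5
    scores = scores.masked_fill(adj_signed == 0, float("-inf"))
    attn = torch.softmax(scores, dim=-1)
    attn = torch.nan_to_num(attn)      # rows with no retained neighbors -> all zeros
    return (attn * adj_signed) @ x     # messages weighted by attention and edge sign

if __name__ == "__main__":
    N, d = 8, 16                       # e.g., 8 variables with 16-dim embeddings
    adj = build_signed_graph(torch.randn(N, d), tau=0.3)
    out = sign_aware_propagation(torch.randn(N, d), adj,
                                 torch.randn(d, d), torch.randn(d, d))
    print(adj.int())
    print(out.shape)

The key design point this sketch mirrors is that positive and negative neighbors share one filtering rule (the similarity magnitude) but contribute messages of opposite sign during propagation, so a strongly anti-correlated variable informs the prediction instead of being discarded as noise.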