MAAT: Mamba Adaptive Anomaly Transformer with association discrepancy for time series
DOI:
10.48550/arxiv.2502.07858
Publication Date:
2025-02-11
AUTHORS (5)
ABSTRACT
Anomaly detection in time series is essential for industrial monitoring and environmental sensing, yet distinguishing anomalies from complex patterns remains challenging. Existing methods such as the Anomaly Transformer and DCdetector have made progress, but they face limitations such as sensitivity to short-term context and inefficiency in noisy, non-stationary environments. To overcome these issues, we introduce MAAT, an improved architecture that enhances association discrepancy modeling and reconstruction quality. MAAT features Sparse Attention, which efficiently captures long-range dependencies by focusing on relevant time steps, thereby reducing computational redundancy. Additionally, a Mamba Selective State Space Model is incorporated into the reconstruction module, using a skip connection and Gated Attention to improve anomaly localization and detection performance. Extensive experiments show that MAAT significantly outperforms previous methods, achieving better anomaly distinguishability and generalization across various applications, setting a new standard for unsupervised time series anomaly detection in real-world scenarios.
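The abstract describes the architecture only at a high level. The sketch below is a hypothetical PyTorch illustration, not the authors' released code: it shows one block that fuses a sparse-attention branch with a state-space-style branch through a learned gate and a residual skip connection, which is the general structure the abstract names. The class names, the top-k sparsification rule, the gating formula, and the causal depthwise convolution standing in for the Mamba selective SSM are all assumptions made for illustration.

```python
# Hypothetical MAAT-style block (illustrative only, not the authors' implementation).
import torch
import torch.nn as nn
import torch.nn.functional as F


class SparseSelfAttention(nn.Module):
    """Self-attention that keeps only the top-k scores per query (k is an assumed hyperparameter)."""

    def __init__(self, d_model: int, n_heads: int = 4, top_k: int = 16):
        super().__init__()
        assert d_model % n_heads == 0
        self.h, self.dk, self.top_k = n_heads, d_model // n_heads, top_k
        self.qkv = nn.Linear(d_model, 3 * d_model)
        self.out = nn.Linear(d_model, d_model)

    def forward(self, x):                                   # x: (B, T, d_model)
        B, T, _ = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        q, k, v = (t.view(B, T, self.h, self.dk).transpose(1, 2) for t in (q, k, v))
        scores = q @ k.transpose(-2, -1) / self.dk ** 0.5   # (B, h, T, T)
        # Keep only the top-k most relevant time steps per query; mask the rest out.
        kth = torch.topk(scores, min(self.top_k, T), dim=-1).values[..., -1:]
        scores = scores.masked_fill(scores < kth, float("-inf"))
        attn = F.softmax(scores, dim=-1)
        y = (attn @ v).transpose(1, 2).reshape(B, T, -1)
        return self.out(y)


class MaatStyleBlock(nn.Module):
    """Gated fusion of a sparse-attention branch and a sequence-model branch, plus a skip connection."""

    def __init__(self, d_model: int):
        super().__init__()
        self.attn = SparseSelfAttention(d_model)
        # Placeholder for the Mamba selective SSM branch: a causal depthwise conv over time.
        self.ssm = nn.Conv1d(d_model, d_model, kernel_size=4, padding=3, groups=d_model)
        self.gate = nn.Linear(2 * d_model, d_model)
        self.norm = nn.LayerNorm(d_model)

    def forward(self, x):                                   # x: (B, T, d_model)
        a = self.attn(x)
        s = self.ssm(x.transpose(1, 2))[..., : x.size(1)].transpose(1, 2)
        g = torch.sigmoid(self.gate(torch.cat([a, s], dim=-1)))   # gated attention over the two branches
        fused = g * a + (1.0 - g) * s
        return self.norm(x + fused)                          # skip connection around the fused branches
```

A reconstruction-based detector would stack such blocks, project back to the input space, and score anomalies from the reconstruction error together with the association discrepancy; those scoring details are not specified in the abstract and are left out here.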