Informer: Beyond Efficient Transformer for Long Sequence Time-Series Forecasting
DOI:
10.1609/aaai.v35i12.17325
Publication Date:
2022-09-08T19:38:33Z
AUTHORS (7)
ABSTRACT
Many real-world applications require the prediction of long sequence time-series, such as electricity consumption planning. Long sequence time-series forecasting (LSTF) demands a high prediction capacity of the model, which is the ability to capture precise long-range dependency coupling between output and input efficiently. Recent studies have shown the potential of Transformer to increase the prediction capacity. However, there are several severe issues with Transformer that prevent it from being directly applicable to LSTF, including quadratic time complexity, high memory usage, and inherent limitation of the encoder-decoder architecture. To address these issues, we design an efficient transformer-based model for LSTF, named Informer, with three distinctive characteristics: (i) a ProbSparse self-attention mechanism, which achieves O(L log L) in time complexity and memory usage, and has comparable performance on sequences' dependency alignment. (ii) the self-attention distilling highlights dominating attention by halving cascading layer input, and efficiently handles extreme long input sequences. (iii) the generative style decoder, while conceptually simple, predicts the long time-series sequences at one forward operation rather than a step-by-step way, which drastically improves the inference speed of long-sequence predictions. Extensive experiments on four large-scale datasets demonstrate that Informer significantly outperforms existing methods and provides a new solution to the LSTF problem.
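The sketch below illustrates the ProbSparse self-attention idea described in characteristic (i): score each query by how far its attention distribution is from uniform (max score minus mean score), keep only the top u = c * ln(L) "active" queries for full softmax attention, and let the remaining "lazy" queries fall back to the mean of the values. This is not the authors' released implementation; in particular it computes the sparsity measurement against all keys (the paper samples keys so that the measurement itself stays at O(L log L)), and the function name prob_sparse_attention and its arguments are hypothetical.

import math
import torch

def prob_sparse_attention(q, k, v, factor=5):
    """q, k, v: (batch, length, d_model). Returns (batch, length, d_model).
    Simplified ProbSparse-style attention sketch; not the official Informer code."""
    B, L, D = q.shape
    scale = 1.0 / math.sqrt(D)
    scores = torch.einsum("bld,bsd->bls", q, k) * scale          # (B, L, L)

    # Query sparsity measurement: max score minus mean score per query.
    # Queries with near-uniform attention get a low measurement ("lazy" queries).
    m = scores.max(dim=-1).values - scores.mean(dim=-1)          # (B, L)

    # Keep only the top-u active queries, u = factor * ln(L).
    u = min(L, max(1, int(factor * math.log(L))))
    top_idx = m.topk(u, dim=-1).indices                          # (B, u)

    # Lazy queries output the mean of the values (the paper's choice for
    # self-attention); active queries receive full softmax attention.
    out = v.mean(dim=1, keepdim=True).expand(B, L, D).clone()
    active_scores = torch.gather(
        scores, 1, top_idx.unsqueeze(-1).expand(B, u, L))        # (B, u, L)
    active_out = torch.softmax(active_scores, dim=-1) @ v        # (B, u, D)
    out.scatter_(1, top_idx.unsqueeze(-1).expand(B, u, D), active_out)
    return out

# Example: a 96-step input sequence with 64-dimensional embeddings.
x = torch.randn(2, 96, 64)
y = prob_sparse_attention(x, x, x)
print(y.shape)  # torch.Size([2, 96, 64])

Because only u = O(ln L) queries receive full attention, the dominant cost drops from O(L^2) toward O(L log L) per layer, which is the efficiency gain the abstract refers to.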
CITATIONS (2740)