Attention meets long short-term memory: A deep learning network for traffic flow forecasting
DOI: 10.1016/j.physa.2021.126485
Publication Date: 2021-10-09
AUTHORS (6)
ABSTRACT
Accurate forecasting of future traffic flow is a fundamental component of intelligent transportation systems and has a wide range of applications. However, timely and accurate traffic forecasting remains an open challenge due to the high nonlinearity and volatility of traffic flow data. Canonical long short-term memory (LSTM) networks are easily drawn to minute-to-minute fluctuations rather than the long-term dependencies of traffic flow evolution. To address this issue, we introduce an attention mechanism into the LSTM network for short-term traffic flow forecasting. The attention mechanism allows the model to assign different weights to different inputs, focus on critical information, and make accurate predictions. Extensive experiments on four benchmark data sets show that the LSTM network equipped with an attention mechanism outperforms commonly used and state-of-the-art models.
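The attention mechanism described in the abstract can be illustrated as a weighted pooling over the LSTM's hidden states: each time step receives a softmax-normalized score, and the context vector used for prediction is the score-weighted sum. The sketch below is a minimal NumPy illustration of this generic idea, not the authors' exact architecture; the scoring vector `w`, the dimensions, and the random inputs are all hypothetical.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def attention_pool(hidden_states, w):
    """Attention-weighted pooling of LSTM hidden states.

    hidden_states: (T, d) array, one hidden vector per time step.
    w: (d,) hypothetical learned scoring vector.
    Returns the (d,) context vector and the (T,) attention weights.
    """
    scores = hidden_states @ w          # one scalar score per time step
    alpha = softmax(scores)             # weights sum to 1
    context = alpha @ hidden_states     # weighted sum of hidden states
    return context, alpha

# Toy example: T = 12 time steps, hidden size d = 8.
rng = np.random.default_rng(0)
T, d = 12, 8
H = rng.normal(size=(T, d))             # stand-in for LSTM hidden states
w = rng.normal(size=d)
context, alpha = attention_pool(H, w)
```

In a full forecasting model, `context` would feed a final regression layer producing the traffic flow prediction; the weights `alpha` indicate which past intervals the model attends to, which is how attention steers the network toward long-term dependencies rather than minute-to-minute noise.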
REFERENCES (46)
CITATIONS (76)