An Improved Time Feedforward Connections Recurrent Neural Networks
KEYWORDS: Feed forward; Feedforward neural network
DOI: 10.32604/iasc.2023.033869
Publication Date: 2023-03-15
AUTHORS (3)
ABSTRACT
Recurrent Neural Networks (RNNs) have been widely applied to temporal problems such as flood forecasting and financial data processing. On the one hand, traditional RNN models amplify the gradient issue due to the strict time-serial dependency, making it difficult to realize a long-term memory function. On the other hand, RNN cells are highly complex, which significantly increases computational complexity and causes a waste of resources during model training. In this paper, an improved Time Feedforward Connections Recurrent Neural Networks (TFC-RNNs) model was first proposed to address the gradient issue. A parallel branch was introduced so that the hidden state at time t − 2 can be transferred directly to time t without the nonlinear transformation at time t − 1. This is effective in improving the long-term dependence of RNNs. Then, a novel cell structure named Single Gate Recurrent Unit (SGRU) was presented, which reduces the number of parameters per cell and consequently the computational complexity. Next, applying the SGRU to the TFC-RNNs yields a new TFC-SGRU model that solves the above two difficulties. Finally, the performance of our TFC-SGRU was verified through several experiments in terms of long-term memory and anti-interference capabilities. Experimental results demonstrated that the TFC-SGRU can capture helpful information at a time step of 1500 and effectively filter out noise. Its accuracy is better than that of LSTM and GRU regarding language processing ability.
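The abstract only sketches the two ideas, so the following is a minimal PyTorch sketch of one plausible reading: a cell with a single update gate (the SGRU idea) plus a linear branch that carries the hidden state from step t − 2 into the step-t update, bypassing the nonlinearity at step t − 1 (the time feedforward connection). The update rule, the layer names (gate, candidate, skip), and the use of a learned linear map on the t − 2 branch are assumptions inferred from the abstract, not the authors' published equations.

```python
import torch
import torch.nn as nn


class TFCSGRUCell(nn.Module):
    """Sketch of a single-gate recurrent cell with a time feedforward
    connection: h_{t-2} enters the step-t update through a linear branch,
    bypassing the nonlinear transformation applied at step t-1.
    The exact update rule is an assumption, not the paper's equations."""

    def __init__(self, input_size: int, hidden_size: int):
        super().__init__()
        # A single update gate, versus two in GRU and three in LSTM,
        # is where the claimed parameter reduction would come from.
        self.gate = nn.Linear(input_size + hidden_size, hidden_size)
        self.candidate = nn.Linear(input_size + hidden_size, hidden_size)
        # Hypothetical linear map for the t-2 skip branch.
        self.skip = nn.Linear(hidden_size, hidden_size, bias=False)

    def forward(self, x, h_prev, h_prev2):
        xh = torch.cat([x, h_prev], dim=-1)
        z = torch.sigmoid(self.gate(xh))           # single gate
        h_tilde = torch.tanh(self.candidate(xh))   # candidate state
        # Gated blend plus the direct contribution from two steps back.
        return z * h_prev + (1.0 - z) * h_tilde + self.skip(h_prev2)


# Unrolling over a toy sequence, tracking both h_{t-1} and h_{t-2}.
cell = TFCSGRUCell(input_size=8, hidden_size=16)
h_prev = h_prev2 = torch.zeros(1, 16)
for x_t in torch.randn(5, 1, 8):
    h_prev2, h_prev = h_prev, cell(x_t, h_prev, h_prev2)
```

Because the t − 2 term is added outside any squashing nonlinearity, its gradient path skips one tanh per step, which is one way such a parallel branch could ease the long-term dependency problem the abstract describes.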