An Enhanced Gated Recurrent Unit with Auto-Encoder for Solving Text Classification Problems

DOI: 10.1007/s13369-021-05691-8 Publication Date: 2021-05-22T11:02:31Z
ABSTRACT
Classification has become an important task for automatically categorizing documents into their respective groups. The purpose of classification is to assign a pre-specified group or class to an instance based on the features observed for that instance. For accurate text classification, feature selection techniques are normally used to identify important features and to remove irrelevant, undesired and noisy features, thereby minimizing the dimensionality of the feature space. Therefore, in this research, a new model named Encoder Simplified GRU (ES-GRU) is proposed, which reduces the dimensionality of the data using an auto-encoder (AE). The Gated Recurrent Unit (GRU) is a deep learning architecture containing an update gate and a reset gate, and is considered one of the most efficient text classification techniques, particularly on sequential datasets. Accordingly, the reset gate is replaced with the update gate in order to reduce redundancy and complexity in the standard GRU. The proposed model has been evaluated on five benchmark text datasets and compared with six well-known baseline text classification approaches: the standard GRU, AE, Long Short-Term Memory, Convolutional Neural Network, Support Vector Machine, and Naive Bayes. Across several performance evaluation measures, a considerable improvement over these state-of-the-art approaches has been observed in the performance of the proposed model.
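The core simplification described in the abstract can be illustrated with a minimal single-gate GRU cell. This is a sketch under assumptions: the paper's exact equations are not given here, so the weight names (Wz, Uz, Wh, Uh) and the precise way the update gate z stands in for the removed reset gate are illustrative, not taken from the paper.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class SimplifiedGRUCell:
    """Illustrative single-gate GRU cell: the reset gate of a standard GRU
    is dropped and the update gate z takes over its role, as the abstract
    describes.  All weight names and initialization choices are assumptions."""

    def __init__(self, input_size, hidden_size, rng=None):
        rng = rng or np.random.default_rng(0)
        s = 1.0 / np.sqrt(hidden_size)
        # Update-gate weights (input-to-hidden and hidden-to-hidden)
        self.Wz = rng.uniform(-s, s, (hidden_size, input_size))
        self.Uz = rng.uniform(-s, s, (hidden_size, hidden_size))
        # Candidate-state weights
        self.Wh = rng.uniform(-s, s, (hidden_size, input_size))
        self.Uh = rng.uniform(-s, s, (hidden_size, hidden_size))

    def step(self, x, h):
        z = sigmoid(self.Wz @ x + self.Uz @ h)              # update gate
        # z is reused where a standard GRU would apply its reset gate r
        h_cand = np.tanh(self.Wh @ x + self.Uh @ (z * h))
        return (1.0 - z) * h + z * h_cand                   # blend old and candidate state

# Run a toy length-5 sequence of 4-dimensional inputs through the cell
cell = SimplifiedGRUCell(input_size=4, hidden_size=3)
h = np.zeros(3)
for x in np.random.default_rng(1).normal(size=(5, 4)):
    h = cell.step(x, h)
```

Compared with a standard GRU, this cell has one fewer gate, so it drops one pair of weight matrices per cell, which is the redundancy and complexity reduction the abstract claims.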