Quaternion Recurrent Neural Networks
DOI:
10.48550/arxiv.1806.04418
Publication Date:
2018-06-12
AUTHORS (7)
Titouan Parcollet, Mirco Ravanelli, Mohamed Morchid, Georges Linarès, Chiheb Trabelsi, Renato De Mori, Yoshua Bengio
ABSTRACT
Recurrent neural networks (RNNs) are powerful architectures to model sequential data, due to their capability to learn short and long-term dependencies between the basic elements of a sequence. Nonetheless, popular tasks such as speech or image recognition involve multi-dimensional input features that are characterized by strong internal dependencies between the dimensions of the input vector. We propose a novel quaternion recurrent neural network (QRNN), alongside a quaternion long short-term memory neural network (QLSTM), that take into account both the external relations and these internal structural dependencies with the quaternion algebra. Similarly to capsules, quaternions allow the QRNN to code internal dependencies by composing and processing multidimensional features as single entities, while the recurrent operation reveals correlations between the elements composing the sequence. We show that both QRNN and QLSTM achieve better performances than RNN and LSTM in a realistic application of automatic speech recognition. Finally, we show that the QRNN and QLSTM reduce by a maximum factor of 3.3x the number of free parameters needed, compared to real-valued RNNs and LSTMs, to reach better results, leading to a more compact representation of the relevant information.
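CODE SKETCH
The core operation behind the QRNN and QLSTM described above is the Hamilton product, which replaces the matrix-vector product of a real-valued RNN. The following is a minimal NumPy sketch, not the authors' released implementation: the names hamilton_product and qrnn_step, the split tanh activation, and all shapes are illustrative assumptions.

    import numpy as np

    def hamilton_product(W, x):
        # Quaternion product W (x) x of a quaternion weight matrix and a
        # quaternion vector. W is a tuple of four real (n, m) matrices
        # (W_r, W_i, W_j, W_k); x is a tuple of four real (m,) vectors.
        Wr, Wi, Wj, Wk = W
        xr, xi, xj, xk = x
        return (
            Wr @ xr - Wi @ xi - Wj @ xj - Wk @ xk,  # real component
            Wr @ xi + Wi @ xr + Wj @ xk - Wk @ xj,  # i component
            Wr @ xj - Wi @ xk + Wj @ xr + Wk @ xi,  # j component
            Wr @ xk + Wi @ xj - Wj @ xi + Wk @ xr,  # k component
        )

    def qrnn_step(h_prev, x_t, W, R, b, act=np.tanh):
        # One recurrent step, h_t = act(W (x) x_t + R (x) h_prev + b),
        # with the activation applied to each component separately
        # ("split" activation).
        wx = hamilton_product(W, x_t)
        rh = hamilton_product(R, h_prev)
        return tuple(act(u + v + c) for u, v, c in zip(wx, rh, b))

    # Usage: 8 quaternion inputs -> 8 quaternion hidden units.
    rng = np.random.default_rng(0)
    n = m = 8
    W = tuple(rng.normal(scale=0.1, size=(n, m)) for _ in range(4))
    R = tuple(rng.normal(scale=0.1, size=(n, n)) for _ in range(4))
    b = tuple(np.zeros(n) for _ in range(4))
    h = tuple(np.zeros(n) for _ in range(4))
    x_t = tuple(rng.normal(size=(m,)) for _ in range(4))
    h = qrnn_step(h, x_t, W, R, b)

Because the four weight components are shared across the quaternion product, a layer mapping m quaternion inputs to n quaternion outputs stores 4nm real parameters, whereas a real-valued layer over the same 4m numbers stores 16nm; this sharing is the source of the parameter reduction quoted in the abstract.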