Cells in Multidimensional Recurrent Neural Networks
Connectionism
DOI: 10.48550/arxiv.1412.2620
Publication Date: 2014-01-01
AUTHORS (5)
ABSTRACT
The transcription of handwritten text on images is one task in machine learning, and one solution is to use multi-dimensional recurrent neural networks (MDRNN) with connectionist temporal classification (CTC). RNNs can contain special units, the long short-term memory (LSTM) cells. They are able to learn long-term dependencies, but they become unstable when the dimension is chosen greater than one. We define some useful and necessary properties for the one-dimensional LSTM cell and extend them to the multi-dimensional case. Thereby we introduce several new cells with better stability. We present a method to design a new cell using the theory of linear shift invariant systems. The new cells are compared on the IFN/ENIT and Rimes databases, where they improve the recognition rate compared to the LSTM cell. So each application where MDRNNs with LSTM cells are used could be improved by substituting them with the newly developed cells.
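For context, the sketch below shows one update step of a conventional 2D multi-dimensional LSTM cell in the Graves-style formulation that the abstract refers to as the baseline; it is not the paper's newly proposed cells, and the function and parameter names are illustrative assumptions. The summation of two gated predecessor cell states is the point where the stability issue discussed in the abstract arises.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def mdlstm_step(x, h_prev, c_prev, W, U, b):
    """One step of a standard 2D multi-dimensional LSTM cell
    (baseline formulation, not the paper's new cells).

    x       : input vector at the current position, shape (n_in,)
    h_prev  : list of 2 hidden states from the predecessor positions
              (left and above), each of shape (n_hid,)
    c_prev  : list of 2 cell states from the same predecessors
    W, U, b : parameters; W maps the input, U maps the two
              concatenated predecessor hidden states
    """
    z = W @ x + U @ np.concatenate(h_prev) + b
    n = len(c_prev[0])
    i  = sigmoid(z[0 * n:1 * n])   # input gate
    f1 = sigmoid(z[1 * n:2 * n])   # forget gate, dimension 1
    f2 = sigmoid(z[2 * n:3 * n])   # forget gate, dimension 2
    o  = sigmoid(z[3 * n:4 * n])   # output gate
    g  = np.tanh(z[4 * n:5 * n])   # cell input
    # Summing two gated predecessor cell states can let the cell
    # state grow without bound -- the instability in dimension > 1
    # that the paper's new cells are designed to avoid.
    c = f1 * c_prev[0] + f2 * c_prev[1] + i * g
    h = o * np.tanh(c)
    return h, c
```

In this baseline there is one forget gate per dimension, so a 2D scan over an image sums two gated cell states at every pixel; the one-dimensional boundedness of the LSTM cell state is lost, which motivates the stability properties the paper defines and the alternative cells it derives.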