Digit recognition using decimal coding and artificial neural network
KEYWORDS
Decimal; Multilayer perceptron; Backpropagation; Digit recognition; Numerical digit
DOI: 10.48129/kjs.v49i1.9556
Publication Date: 2021-12-06
AUTHORS (6)
ABSTRACT
Current artificial neural network image recognition techniques use all the pixels of an image as input. In this paper, we present an efficient method for handwritten digit recognition that involves extracting the characteristics of a digit image by coding each row as a decimal value, i.e., transforming the row's binary representation into a decimal value. This operation is called the decimal coding of rows. The set of decimal values calculated from the initial image is arranged in a vector and normalized; these values represent the inputs to the neural network. The approach proposed in this work uses a multilayer perceptron for the classification, recognition, and prediction of digits 0 to 9. In this study, a dataset of 1797 samples was obtained from a database imported from the Scikit-learn library. Backpropagation was used as the learning algorithm to train the network. The results show that the proposed approach achieves better performance than two other schemes in terms of accuracy and execution time.
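The pipeline described in the abstract can be sketched in a few lines of Python. The snippet below is a minimal illustration under stated assumptions, not the authors' implementation: it assumes a binarization threshold of 0 on scikit-learn's 8x8 digits images, encodes each 8-pixel row as a decimal value in 0 to 255, normalizes by 255, and trains scikit-learn's MLPClassifier, whose gradient-based fitting corresponds to backpropagation training. The helper name decimal_code_rows and all hyperparameters are assumptions.

```python
# Sketch of "decimal coding of rows" for digit recognition (illustrative only).
import numpy as np
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Load the 1797-sample digits dataset (8x8 grayscale images) from scikit-learn.
digits = load_digits()

def decimal_code_rows(image, threshold=0):
    """Binarize an 8x8 image and encode each row's 8 bits as one decimal value."""
    binary = (image > threshold).astype(int)   # 8x8 matrix of 0/1 (assumed threshold)
    weights = 2 ** np.arange(7, -1, -1)        # bit weights [128, 64, ..., 1]
    return binary @ weights                    # 8 decimal values, each in 0..255

# Build one 8-component feature vector per image and normalize to [0, 1].
features = np.array([decimal_code_rows(img) for img in digits.images]) / 255.0

X_train, X_test, y_train, y_test = train_test_split(
    features, digits.target, test_size=0.25, random_state=0)

# Multilayer perceptron classifier trained with backpropagation-based updates.
mlp = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
mlp.fit(X_train, y_train)
print(f"test accuracy: {mlp.score(X_test, y_test):.3f}")
```

Note that each image here is reduced from 64 pixel inputs to 8 decimal-coded inputs, which is the efficiency gain the abstract claims over feeding all pixels to the network.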