Training Convolutional Neural Networks with the Forward-Forward algorithm

Keywords: MNIST database, Backpropagation, Hyperparameter, Rprop
DOI: 10.48550/arxiv.2312.14924 Publication Date: 2023-01-01
ABSTRACT
The recent successes in analyzing images with deep neural networks are almost exclusively achieved with Convolutional Neural Networks (CNNs). The training of these CNNs, and in fact of all deep neural network architectures, uses the backpropagation algorithm, where the output of the network is compared with the desired result and the difference is then used to tune the weights of the network towards the desired outcome. In a 2022 preprint, Geoffrey Hinton suggested an alternative way of training which passes the desired results together with the images at the input of the network. This so-called Forward-Forward (FF) algorithm has up to now only been used in fully connected networks. In this paper, we show how the FF paradigm can be extended to CNNs. Our FF-trained CNN, featuring a novel spatially-extended labeling technique, achieves a classification accuracy of 99.16% on the MNIST hand-written digits dataset. We show how different hyperparameters affect the performance of the proposed algorithm and compare the results to CNNs trained with the standard backpropagation approach. Furthermore, we use Class Activation Maps to investigate which type of features are learnt by the FF algorithm.
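The abstract's core idea — training each layer locally by contrasting "positive" data (real inputs paired with their labels) against "negative" data, instead of backpropagating an output error — can be illustrated with a minimal sketch. The following is an assumption-laden toy, not the paper's implementation: it uses a single fully connected layer (the paper extends FF to convolutional layers), Hinton's sum-of-squares "goodness" objective, and synthetic positive/negative data in place of labeled MNIST images.

```python
import numpy as np

rng = np.random.default_rng(0)

def goodness(a):
    # FF "goodness" of a layer: sum of squared activations per sample.
    return (a ** 2).sum(axis=1)

def ff_step(W, x_pos, x_neg, theta=2.0, lr=0.03):
    """One local FF update for a single layer: raise the goodness of
    positive data above the threshold theta, and push the goodness of
    negative data below it. No error signal from later layers is used."""
    for x, sign in ((x_pos, +1.0), (x_neg, -1.0)):
        a = np.maximum(x @ W, 0.0)                     # ReLU activations
        # Per-sample probability of being classified "positive":
        p = 1.0 / (1.0 + np.exp(-sign * (goodness(a) - theta)))
        # Gradient of -log(p) w.r.t. the activations (local to this layer).
        grad_a = -(1.0 - p)[:, None] * sign * 2.0 * a
        grad_a[a <= 0.0] = 0.0                         # ReLU mask
        W -= lr * x.T @ grad_a / len(x)
    return W

# Toy data (assumption): positive samples share a common structure,
# negative samples are pure noise.
W = rng.normal(scale=0.1, size=(16, 8))
x_pos = np.tile(np.linspace(0, 1, 16), (32, 1)) + 0.05 * rng.normal(size=(32, 16))
x_neg = rng.normal(size=(32, 16))

for _ in range(200):
    W = ff_step(W, x_pos, x_neg)

g_pos = goodness(np.maximum(x_pos @ W, 0.0)).mean()
g_neg = goodness(np.maximum(x_neg @ W, 0.0)).mean()
print(g_pos, g_neg)  # positive goodness should end up above negative
```

In the paper's setting, the label information would be embedded into the input image itself (via the spatially-extended labeling the abstract mentions), so that "positive" means image-with-correct-label and "negative" means image-with-wrong-label; at inference time, the label whose injection yields the highest accumulated goodness is chosen.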