On-Chip Training Spiking Neural Networks Using Approximated Backpropagation With Analog Synaptic Devices
Keywords: MNIST database, Backpropagation
DOI: 10.3389/fnins.2020.00423
Publication Date: 2020-07-07
AUTHORS (11)
ABSTRACT
Hardware-based spiking neural networks (SNNs) inspired by the biological nervous system are regarded as an innovative computing paradigm with very low power consumption and massively parallel operation. To train SNNs with supervision, we propose an efficient on-chip training scheme that approximates the backpropagation algorithm in a form suitable for hardware implementation. We show that the accuracy of the proposed scheme is close to that of conventional artificial neural networks (ANNs) by exploiting the stochastic characteristics of neurons. In the hardware configuration, gated Schottky diodes (GSDs) are used as synaptic devices, which have a saturated current with respect to the input voltage. We design the SNN with GSDs, which can update their conductance in parallel to speed up the overall system. The performance of the on-chip training SNN is validated through MNIST data set classification with respect to network size and total time step. The SNN systems achieve 97.83% accuracy with 1 hidden layer and 98.44% with 4 hidden layers in fully connected networks. We then evaluate the effect of the non-linearity and asymmetry of the conductance response for long-term potentiation (LTP) and long-term depression (LTD) on the performance of the on-chip training SNN. In addition, the impact of device variations on performance is evaluated.
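The non-linear and asymmetric LTP/LTD conductance response mentioned in the abstract can be illustrated with a commonly used exponential device-update model. This is a minimal sketch, not the paper's measured GSD behavior: the parameters `alpha` (step size) and `beta` (non-linearity factor) and the exponential form itself are illustrative assumptions.

```python
import numpy as np

def update_conductance(G, pulse_sign, G_min=0.0, G_max=1.0,
                       alpha=0.05, beta=3.0):
    """Apply one LTP (+1) or LTD (-1) programming pulse to a synaptic
    conductance G, using a generic exponential non-linearity model.
    alpha and beta are illustrative parameters, not values measured
    from the gated Schottky diodes in the paper."""
    g = (G - G_min) / (G_max - G_min)       # normalized conductance in [0, 1]
    if pulse_sign > 0:
        # LTP: the potentiation step shrinks as G approaches G_max
        dG = alpha * np.exp(-beta * g)
    else:
        # LTD: the depression step shrinks as G approaches G_min
        dG = -alpha * np.exp(-beta * (1.0 - g))
    return float(np.clip(G + dG, G_min, G_max))

# Sweep 100 LTP pulses followed by 100 LTD pulses to trace the
# asymmetric potentiation/depression curve.
G = 0.0
trace = []
for _ in range(100):
    G = update_conductance(G, +1)
    trace.append(G)
for _ in range(100):
    G = update_conductance(G, -1)
    trace.append(G)
```

With `beta > 0` the up and down curves are not mirror images, which is the asymmetry whose impact on on-chip training accuracy the paper evaluates; setting `beta = 0` recovers an ideal linear, symmetric update.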