Implementation of Convolutional Neural Networks in Memristor Crossbar Arrays with Binary Activation and Weight Quantization
Memristor
DOI:
10.1021/acsami.3c13775
Publication Date:
2024-01-01T11:33:47Z
AUTHORS (7)
ABSTRACT
We propose a hardware-friendly architecture of a convolutional neural network using a 32 × memristor crossbar array with an overshoot suppression layer. The gradual switching characteristics in both set and reset operations enable the implementation of 3-bit multilevel operation over the whole array, which can be utilized as 16 kernels. Moreover, a binary activation function mapped to the read voltage and ground is introduced to evaluate the result of training with a boundary of 0.5 and its estimated gradient. Additionally, we adopt a fixed kernel method, in which inputs are sequentially applied in a differential pair scheme, reducing the waste of unused cells. Since the scheme is robust against device state variations, the neuron circuit is experimentally demonstrated on a customized breadboard. Thanks to the analogue behavior of the device, accurate vector–matrix multiplication (VMM) is achieved by combining sequential weights obtained through tuning of the array. In addition, feature images extracted by VMM during hardware inference of 100 test samples are classified off-chip, and the classification performance is compared with software results. Finally, the classification results depending on the tolerance are statistically verified over several cycles.
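The abstract describes three ingredients that can be sketched concretely: 3-bit weight quantization realized as 16 kernels, a differential-pair mapping of signed weights onto conductances for vector–matrix multiplication (VMM), and a binary activation with a 0.5 boundary trained through an estimated gradient. The following minimal NumPy sketch shows how these pieces might fit together; the read voltage, conductance window, normalization, and surrogate-gradient window are illustrative assumptions, not parameters reported in the paper.

```python
import numpy as np

# All numeric values below are illustrative assumptions, not values from the paper.
V_READ = 0.2                  # read voltage (V) applied for a logical "1" input
G_MIN, G_MAX = 1e-6, 8e-6     # assumed programmable conductance window (S)

def quantize_3bit(w):
    """Snap weights in [-1, 1] to one of 8 evenly spaced levels (3-bit)."""
    levels = np.linspace(-1.0, 1.0, 8)
    idx = np.abs(levels - w[..., None]).argmin(axis=-1)
    return levels[idx]

def weights_to_differential_pairs(wq):
    """Map signed quantized weights to (G+, G-) conductance pairs.

    The positive part of a weight is programmed on the '+' column and the
    negative part on the '-' column, so the difference of the two column
    currents encodes the signed weight."""
    g_pos = G_MIN + (G_MAX - G_MIN) * np.clip(wq, 0.0, None)
    g_neg = G_MIN + (G_MAX - G_MIN) * np.clip(-wq, 0.0, None)
    return g_pos, g_neg

def crossbar_vmm(x_binary, g_pos, g_neg):
    """VMM on the crossbar: binary inputs drive the rows at V_READ or ground,
    column currents sum by Kirchhoff's current law, and the differential pair
    yields a signed output."""
    v = x_binary * V_READ
    return v @ g_pos - v @ g_neg

def binary_activation(z, boundary=0.5):
    """Forward pass of the binary activation with a 0.5 decision boundary."""
    return (z > boundary).astype(np.float32)

def binary_activation_surrogate_grad(z, boundary=0.5, width=0.5):
    """Estimated gradient for training: pass gradients through only near the
    boundary (straight-through-style surrogate); `width` is an assumption."""
    return (np.abs(z - boundary) < width).astype(np.float32)

# Usage: a 3x3 binary patch (9 rows) against 16 quantized kernels (columns).
rng = np.random.default_rng(0)
w = rng.uniform(-1.0, 1.0, size=(9, 16))
g_pos, g_neg = weights_to_differential_pairs(quantize_3bit(w))
x = rng.integers(0, 2, size=(1, 9)).astype(np.float32)
i_out = crossbar_vmm(x, g_pos, g_neg)                    # signed column currents
y = binary_activation(i_out / (np.abs(i_out).max() + 1e-12))
```

The differential pair avoids the need for negative conductances, and the binary input/output convention matches the abstract's mapping of activations to the read voltage and ground.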