A Weight Importance Analysis Technique for Area- and Power-Efficient Binary Weight Neural Network Processor Design
02 engineering and technology
0202 electrical engineering, electronic engineering, information engineering
DOI: 10.1007/s12559-020-09794-6
Publication Date: 2021-01-04
AUTHORS (6)
ABSTRACT
Recently, binary weight neural network (BWNN) processor design has attracted considerable attention due to its low computational complexity and memory demands. In BWNN processor design, emerging memory technologies such as RRAM can replace conventional SRAM to save area and access power. However, RRAM is prone to bit errors, which degrade classification accuracy. Combining BWNNs with RRAM to reduce area overhead and power consumption while maintaining high classification accuracy is therefore a significant research challenge. In this work, we propose an automatic weight importance analysis technique and a mixed weight storage scheme to address this issue. For demonstration, we applied the proposed techniques to two typical BWNNs. The experimental results show that more than 78% (40%) area saving and 57% (30%) power saving can be achieved with less than 1% accuracy loss. The proposed techniques are applicable to resource- and power-constrained neural network processor design and show significant potential for AI-based Internet-of-Things (IoT) devices, which usually have limited computational and storage resources.
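The abstract does not spell out the weight importance analysis or the mixed storage scheme in detail; the following minimal Python sketch illustrates one plausible reading, assuming importance is scored by loss-gradient magnitude and RRAM bit errors are modeled as independent random sign flips. The names importance_scores and mixed_storage_readout and the parameters sram_fraction and rram_ber are hypothetical, introduced here for illustration, not the authors' implementation.

    import numpy as np

    rng = np.random.default_rng(0)

    def importance_scores(grads):
        # Assumption: a weight whose loss gradient has large magnitude is
        # "important" -- flipping its sign would perturb the loss strongly.
        return np.abs(grads)

    def mixed_storage_readout(weights, scores, sram_fraction, rram_ber):
        # Simulate a mixed weight memory: the top sram_fraction of weights
        # (by importance score) are read back error-free from SRAM, while
        # the rest sit in RRAM, where each cell flips independently with
        # probability rram_ber (a simplified bit-error model).
        n = weights.size
        n_sram = int(sram_fraction * n)
        order = np.argsort(scores.ravel())[::-1]      # most important first
        stored = weights.ravel().copy()
        rram_idx = order[n_sram:]                     # weights relegated to RRAM
        flips = rng.random(rram_idx.size) < rram_ber  # per-cell error events
        stored[rram_idx[flips]] *= -1.0               # a bit error flips the sign
        return stored.reshape(weights.shape)

    # Toy usage: 1024 binary (+1/-1) weights with synthetic gradients.
    w = rng.choice([-1.0, 1.0], size=1024)
    g = rng.normal(size=1024)
    readout = mixed_storage_readout(w, importance_scores(g),
                                    sram_fraction=0.2, rram_ber=1e-2)
    print("weights corrupted:", int((readout != w).sum()))

In the paper's setting, one would sweep sram_fraction to trade the reported area and power savings against the sub-1% accuracy loss.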