Gated Attention Coding for Training High-Performance and Efficient Spiking Neural Networks

DOI: 10.1609/aaai.v38i1.27816
Publication Date: 2024-03-25T08:50:05Z
ABSTRACT
Spiking neural networks (SNNs) are emerging as an energy-efficient alternative to traditional artificial neural networks (ANNs) due to their unique spike-based, event-driven nature. Coding is crucial in SNNs as it converts external input stimuli into spatio-temporal feature sequences. However, most existing deep SNNs rely on direct coding, which generates powerless spike representations and lacks the temporal dynamics inherent in human vision. Hence, we introduce Gated Attention Coding (GAC), a plug-and-play module that leverages a multi-dimensional gated attention unit to efficiently encode inputs into powerful representations before feeding them into the SNN architecture. GAC functions as a preprocessing layer that does not disrupt the spike-driven nature of the SNN, making it amenable to efficient neuromorphic hardware implementation with minimal modifications. Through an observer-model theoretical analysis, we demonstrate that GAC's attention mechanism improves coding efficiency. Experiments on the CIFAR10/100 and ImageNet datasets show that GAC achieves state-of-the-art accuracy with remarkable efficiency. Notably, we improve top-1 accuracy by 3.10% on CIFAR100 with only 6 time steps and by 1.07% on ImageNet, while reducing energy usage to 66.9% of previous works. To the best of our knowledge, this is the first time an attention-based dynamic coding scheme has been explored in deep SNNs, with exceptional effectiveness and efficiency on large-scale datasets. Code is available at https://github.com/bollossom/GAC.
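
To make the abstract's description concrete, the following PyTorch sketch illustrates the general idea of gated attention coding: a convolutional stem plays the role of direct coding, the features are repeated over T time steps, and a sigmoid-gated branch modulates the repeated sequence so each step carries distinct dynamics. This is a minimal illustration under stated assumptions, not the authors' method; the class name GatedAttentionCoding, the grouped-convolution gate, and all hyperparameters here are hypothetical stand-ins for the paper's multi-dimensional gated attention unit (see the linked repository for the real implementation).

# Hypothetical sketch of gated-attention input coding; not the authors' GAC code.
import torch
import torch.nn as nn

class GatedAttentionCoding(nn.Module):
    """Encodes a static image into a T-step spatio-temporal sequence.

    A conv stem stands in for direct coding; a sigmoid-gated branch
    (here, a T-grouped spatial convolution) modulates the repeated
    features so each time step differs, mimicking temporal dynamics.
    """

    def __init__(self, in_channels: int = 3, out_channels: int = 64, T: int = 6):
        super().__init__()
        self.T = T
        # Direct-coding stem: conv + batch norm, as in typical deep SNN inputs.
        self.encode = nn.Sequential(
            nn.Conv2d(in_channels, out_channels, kernel_size=3, padding=1, bias=False),
            nn.BatchNorm2d(out_channels),
        )
        # Gate over the flattened (T * C) dimension; one group per time step.
        # This is an illustrative placeholder for the multi-dimensional
        # gated attention unit described in the abstract.
        self.gate = nn.Sequential(
            nn.Conv2d(T * out_channels, T * out_channels, kernel_size=3,
                      padding=1, groups=T, bias=False),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (B, C_in, H, W) static image.
        feat = self.encode(x)                               # (B, C, H, W)
        B, C, H, W = feat.shape
        seq = feat.unsqueeze(1).repeat(1, self.T, 1, 1, 1)  # (B, T, C, H, W)
        flat = seq.reshape(B, self.T * C, H, W)
        gated = flat * self.gate(flat)                      # element-wise gating
        return gated.reshape(B, self.T, C, H, W)            # sequence fed to the SNN

if __name__ == "__main__":
    coder = GatedAttentionCoding(T=6)
    out = coder(torch.randn(2, 3, 32, 32))
    print(out.shape)  # torch.Size([2, 6, 64, 32, 32])

Because the gate is applied before any spiking layers, this kind of module acts purely as a preprocessing stage, which is consistent with the abstract's claim that GAC leaves the spike-driven computation of the downstream SNN untouched.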