ESL-SNNs: An Evolutionary Structure Learning Strategy for Spiking Neural Networks

DOI: 10.1609/aaai.v37i1.25079
Publication Date: 2023-06-27
ABSTRACT
Spiking neural networks (SNNs) have manifested remarkable advantages in power consumption and event-driven processing during inference. To take full advantage of their low power consumption and further improve their efficiency, pruning methods have been explored to find sparse SNNs without redundant connections after training. However, parameter redundancy still hinders the efficiency of SNNs during training. In the human brain, the rewiring process of neural networks is highly dynamic, while synaptic connections remain relatively sparse during brain development. Inspired by this, we propose an efficient evolutionary structure learning (ESL) framework for SNNs, named ESL-SNNs, to implement sparse SNN training from scratch. The pruning and regeneration of synaptic connections evolve dynamically during learning, yet the structural sparsity is kept at a fixed level. As a result, ESL-SNNs can search for the optimal sparse connectivity by exploring all possible parameters across time. Our experiments show that the proposed ESL-SNNs framework learns SNNs with sparse structures effectively while limiting the accuracy loss. ESL-SNNs achieve merely 0.28% accuracy loss at 10% connection density on the DVS-Cifar10 dataset. This work presents a brand-new approach for sparse training of SNNs from scratch with biologically plausible evolutionary mechanisms, closing the gap in expressibility between sparse and dense training. Hence, it has great potential for lightweight SNN training and inference with low power consumption and small memory usage.
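The evolutionary mechanism the abstract describes, pruning and regenerating synaptic connections while holding overall sparsity constant, can be illustrated with a minimal numpy sketch. This is a generic SET-style prune-and-regrow step on a binary connection mask, an assumption standing in for the paper's exact update rule; the function names (`init_sparse_mask`, `evolve_mask`) and the magnitude-based pruning criterion are illustrative, not taken from the paper.

```python
import numpy as np


def init_sparse_mask(shape, density, rng):
    # Random binary mask with the target connection density.
    return (rng.random(shape) < density).astype(np.float32)


def evolve_mask(weights, mask, prune_frac, rng):
    """One prune-and-regrow step at constant sparsity.

    SET-style sketch (an assumption, not the paper's exact rule):
    drop the weakest active connections, then regenerate the same
    number of synapses at random inactive positions.
    """
    active = np.flatnonzero(mask)
    n_prune = int(prune_frac * active.size)
    if n_prune == 0:
        return mask
    # Prune: remove the active connections with the smallest |weight|.
    magnitudes = np.abs(weights.ravel()[active])
    drop = active[np.argsort(magnitudes)[:n_prune]]
    mask.ravel()[drop] = 0.0
    # Regenerate: activate an equal number of currently inactive synapses,
    # so the structural sparsity stays at the same level.
    inactive = np.flatnonzero(mask.ravel() == 0)
    grow = rng.choice(inactive, size=n_prune, replace=False)
    mask.ravel()[grow] = 1.0
    return mask


rng = np.random.default_rng(0)
mask = init_sparse_mask((64, 64), density=0.10, rng=rng)
weights = rng.standard_normal((64, 64)).astype(np.float32)
before = mask.sum()
mask = evolve_mask(weights, mask, prune_frac=0.3, rng=rng)
assert mask.sum() == before  # connection density is unchanged
```

Applied once per epoch to each layer's mask, a loop like this explores different connectivity patterns over training while the memory footprint of the network never exceeds the fixed density budget.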