A Signal Propagation Perspective for Pruning Neural Networks at Initialization

Keywords: Initialization · Pruning · Signal Propagation
DOI: 10.48550/arXiv.1906.06307 · Publication Date: 2019-06
ABSTRACT
Network pruning is a promising avenue for compressing deep neural networks. A typical approach to pruning starts by training a model and then removing redundant parameters while minimizing the impact on what is learned. Alternatively, a recent approach shows that pruning can be done at initialization prior to training, based on a saliency criterion called connection sensitivity. However, it remains unclear exactly why pruning an untrained, randomly initialized network is effective. In this work, noting connection sensitivity as a form of gradient, we formally characterize initialization conditions that ensure reliable connection sensitivity measurements, which in turn yield effective pruning results. Moreover, we analyze the signal propagation properties of the resulting pruned networks and introduce a simple, data-free method to improve their trainability. Our modifications to the existing pruning-at-initialization method lead to improved results on all tested network models for image classification tasks. Furthermore, we empirically study the effect of supervision for pruning and demonstrate that our signal propagation perspective, combined with unsupervised pruning, can be useful in various scenarios where pruning is applied to non-standard, arbitrarily-designed architectures.
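To make the criterion concrete: connection sensitivity scores each weight by the magnitude of the gradient of the loss with respect to a per-connection gate c (set to all-ones at initialization), so with the reparameterization w_gated = c ⊙ w the chain rule gives dL/dc_j = w_j · dL/dw_j, and the saliency can be read off an ordinary backward pass. The following is a minimal, hypothetical sketch of this SNIP-style pruning step in PyTorch, not the authors' released code; the helper name snip_masks, the toy MLP, the random data, and the 90% sparsity level are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def snip_masks(model, inputs, targets, sparsity=0.9):
    """Return per-layer binary masks keeping the top (1 - sparsity)
    fraction of weights by connection sensitivity |dL/dc_j|.

    With c = 1 at initialization, dL/dc_j = w_j * dL/dw_j, so one
    ordinary backward pass suffices to score every connection.
    """
    model.zero_grad()
    loss = F.cross_entropy(model(inputs), targets)
    loss.backward()

    # Score weight matrices only; biases (dim 1) are left unpruned.
    params = [p for p in model.parameters() if p.dim() > 1]
    scores = [(p.grad * p).abs() for p in params]

    # Single global threshold across all layers.
    flat = torch.cat([s.flatten() for s in scores])
    k = int((1.0 - sparsity) * flat.numel())
    threshold = flat.topk(k).values.min()
    return [(s >= threshold).float() for s in scores]

# Toy usage on a small MLP with random data (illustrative only).
torch.manual_seed(0)
model = nn.Sequential(nn.Linear(784, 300), nn.ReLU(), nn.Linear(300, 10))
x, y = torch.randn(128, 784), torch.randint(0, 10, (128,))
masks = snip_masks(model, x, y, sparsity=0.9)
with torch.no_grad():
    for p, m in zip([p for p in model.parameters() if p.dim() > 1], masks):
        p.mul_(m)  # prune once at initialization, then train as usual
```

The data-free trainability fix mentioned in the abstract can likewise be sketched. One way to encourage faithful signal propagation through sparse layers, in the spirit of the paper's method though not necessarily identical to it, is to push each masked weight matrix toward row-orthogonality so that signals neither explode nor vanish. The objective ||(M ⊙ W)(M ⊙ W)ᵀ − I||², the helper name enforce_approx_isometry, and the optimizer settings below are assumptions for illustration.

```python
def enforce_approx_isometry(weight, mask, steps=100, lr=0.1):
    """Optimize the surviving weights so the masked matrix has
    approximately orthogonal rows (an approximate isometry)."""
    w = (weight * mask).clone().requires_grad_(True)
    opt = torch.optim.Adam([w], lr=lr)
    eye = torch.eye(w.shape[0])
    for _ in range(steps):
        opt.zero_grad()
        sparse_w = w * mask  # pruned entries get zero gradient, stay zero
        loss = ((sparse_w @ sparse_w.t() - eye) ** 2).sum()
        loss.backward()
        opt.step()
    return (w * mask).detach()

# Usage (continuing the example above): repair each pruned layer.
for p, m in zip([p for p in model.parameters() if p.dim() > 1], masks):
    repaired = enforce_approx_isometry(p.detach(), m)
    with torch.no_grad():
        p.copy_(repaired)
```

Note that this step needs no data: the objective depends only on the initial weights and the pruning mask, which is what makes the repair applicable before any training batch is seen.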