Spatial-Winograd Pruning Enabling Sparse Winograd Convolution

Keywords: Pruning, Convolution (computer science)
DOI: 10.48550/arxiv.1901.02132 Publication Date: 2019-01-01
ABSTRACT
Deep convolutional neural networks (CNNs) are deployed in various applications but demand immense computational requirements. Pruning techniques and Winograd convolution are two typical methods to reduce CNN computation. However, they cannot be directly combined because the Winograd transformation fills in the sparsity resulting from pruning. Li et al. (2017) propose sparse Winograd convolution, in which weights are directly pruned in the Winograd domain, but this technique is not very practical because Winograd-domain retraining requires low learning rates and hence significantly longer training time. Besides, Liu et al. (2018) move the ReLU function into the Winograd domain, which can help increase the weight sparsity but requires changes to the network structure. To achieve a high Winograd-domain weight sparsity without changing network structures, we propose a new pruning method, spatial-Winograd pruning. As the first step, spatial-domain weights are pruned in a structured way, which efficiently transfers the spatial-domain sparsity into the Winograd domain and avoids Winograd-domain retraining. For the next step, we also perform pruning and retraining directly in the Winograd domain, but propose to use an importance factor matrix to adjust weight importance and weight gradients. This adjustment makes it possible to effectively retrain the pruned Winograd-domain network without changing the network structure. For three models on the datasets of CIFAR-10, CIFAR-100, and ImageNet, our proposed method can achieve Winograd-domain sparsities of 63%, 50%, and 74%, respectively.
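The core obstacle the abstract describes can be seen directly with the standard Winograd F(2x2, 3x3) kernel transform from Lavin and Gray: a 3x3 spatial kernel g is mapped to a 4x4 Winograd-domain kernel U = G g G^T, and zeros in g generally do not survive the transform. The sketch below (an illustration, not the paper's pruning algorithm; the diagonal kernel is an arbitrary example) shows a 3x3 kernel with only 3 nonzero weights becoming an almost fully dense 4x4 Winograd-domain kernel.

```python
import numpy as np

# Kernel transform matrix G for Winograd F(2x2, 3x3)
# (Lavin & Gray, "Fast Algorithms for Convolutional Neural Networks").
G = np.array([
    [1.0,  0.0, 0.0],
    [0.5,  0.5, 0.5],
    [0.5, -0.5, 0.5],
    [0.0,  0.0, 1.0],
])

# A heavily pruned 3x3 spatial kernel: 6 of 9 weights are zero.
g = np.array([
    [1.0, 0.0, 0.0],
    [0.0, 2.0, 0.0],
    [0.0, 0.0, 3.0],
])

# Winograd-domain kernel: U = G g G^T (4x4).
U = G @ g @ G.T

print("spatial nonzeros:", np.count_nonzero(g), "/ 9")   # 3 / 9
print("Winograd nonzeros:", np.count_nonzero(U), "/ 16") # 14 / 16
```

Even though two thirds of the spatial weights are zero, 14 of the 16 Winograd-domain weights are nonzero, so spatial pruning alone yields almost no savings in the element-wise multiplications of Winograd convolution. This is why the paper prunes in a structured way that transfers sparsity into the Winograd domain.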