Latent Weights Do Not Exist: Rethinking Binarized Neural Network Optimization

FOS: Computer and information sciences; Machine Learning (cs.LG); Machine Learning (stat.ML)
DOI: 10.48550/arXiv.1906.02107
Publication Date: 2019
ABSTRACT
Accepted at the 33rd Conference on Neural Information Processing Systems (NeurIPS 2019), Vancouver, Canada. Updated ImageNet results.

Optimization of Binarized Neural Networks (BNNs) currently relies on real-valued latent weights to accumulate small update steps. In this paper, we argue that these latent weights cannot be treated analogously to weights in real-valued networks. Instead, their main role is to provide inertia during training. We interpret current methods in terms of inertia and provide novel insights into the optimization of BNNs. We subsequently introduce the first optimizer specifically designed for BNNs, the Binary Optimizer (Bop), and demonstrate its performance on CIFAR-10 and ImageNet. Together, the redefinition of latent weights as inertia and the introduction of Bop enable a better understanding of BNN optimization and open the way for further improvements in training methodologies for BNNs. Code is available at: https://github.com/plumerai/rethinking-bnn-optimization
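The abstract only names Bop without spelling out its mechanics. As a rough illustration of the inertia idea, below is a minimal NumPy sketch of a Bop-style update, assuming the threshold-based flipping rule described in the paper: an exponential moving average of the gradient acts as inertia, and a binary weight flips only when that average exceeds a threshold and agrees in sign with the weight. The class and parameter names (Bop, gamma, tau) are illustrative and not taken from the linked repository's API.

```python
import numpy as np

class Bop:
    """Sketch of a Bop-style optimizer for binary weights in {-1, +1}.

    Instead of accumulating updates into latent real-valued weights,
    it keeps a per-weight exponential moving average of gradients
    (the "inertia") and flips a weight only on sustained pressure.
    """

    def __init__(self, gamma=1e-3, tau=1e-5):
        self.gamma = gamma  # adaptivity rate of the moving average
        self.tau = tau      # flip threshold
        self.m = None       # per-weight gradient moving average

    def step(self, weights, grads):
        """Return updated binary weights given the current gradients."""
        if self.m is None:
            self.m = np.zeros_like(grads)
        # Exponential moving average of the gradient signal.
        self.m = (1.0 - self.gamma) * self.m + self.gamma * grads
        # Flip a weight when the averaged gradient is strong (|m| > tau)
        # and has the same sign as the weight: descending the loss would
        # then push the weight past zero toward the opposite sign.
        flip = (np.abs(self.m) > self.tau) & (np.sign(self.m) == np.sign(weights))
        return np.where(flip, -weights, weights)


# Toy usage: one step on four binary weights with made-up gradients.
opt = Bop(gamma=0.1, tau=0.05)
w = np.array([1.0, -1.0, 1.0, -1.0])
g = np.array([0.9, 0.9, -0.9, -0.9])
w = opt.step(w, g)
print(w)  # the first and last weights flip; the middle two do not
```

Note the contrast with latent-weight methods: there is no real-valued shadow copy of the weights to clip or decay, only the moving average m, which is what the paper reinterprets as inertia.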