Barlow Twins: Self-Supervised Learning via Redundancy Reduction

Subjects: Machine Learning (cs.LG); Artificial Intelligence (cs.AI); Computer Vision and Pattern Recognition (cs.CV); Neurons and Cognition (q-bio.NC)
DOI: 10.48550/arXiv.2103.03230 Publication Date: 2021-03
ABSTRACT
Self-supervised learning (SSL) is rapidly closing the gap with supervised methods on large computer vision benchmarks. A successful approach to SSL is to learn embeddings which are invariant to distortions of the input sample. However, a recurring issue with this approach is the existence of trivial constant solutions. Most current methods avoid such collapsed solutions by careful implementation details. We propose an objective function that naturally avoids collapse by measuring the cross-correlation matrix between the outputs of two identical networks fed with distorted versions of a sample, and making it as close to the identity matrix as possible. This causes the embedding vectors of distorted versions of a sample to be similar, while minimizing the redundancy between the components of these vectors. The method is called Barlow Twins, owing to neuroscientist H. Barlow's redundancy-reduction principle applied to a pair of identical networks. Barlow Twins does not require large batches nor asymmetry between the network twins such as a predictor network, gradient stopping, or a moving average on the weight updates. Intriguingly, it benefits from very high-dimensional output vectors. Barlow Twins outperforms previous methods on ImageNet for semi-supervised classification in the low-data regime, and is on par with the current state of the art for ImageNet classification with a linear classifier head, as well as for transfer tasks such as object detection.
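The objective described above admits a compact implementation: normalize each embedding dimension over the batch, form the empirical cross-correlation matrix between the two distorted views, and penalize its deviation from the identity. The sketch below is a minimal PyTorch rendering of that description; the function name `barlow_twins_loss`, the trade-off weight `lambd`, and its default value are illustrative assumptions, not details quoted from the paper.

```python
import torch

def barlow_twins_loss(z_a: torch.Tensor, z_b: torch.Tensor,
                      lambd: float = 5e-3) -> torch.Tensor:
    """Push the cross-correlation matrix of two embedding batches toward identity.

    z_a, z_b: (N, D) embeddings of two distorted views of the same N samples.
    lambd:    weight on the off-diagonal (redundancy-reduction) term;
              the default here is an assumed value, not one quoted above.
    """
    n, _ = z_a.shape
    # Normalize each embedding dimension across the batch (zero mean, unit std);
    # the small epsilon guards against a zero-variance (collapsed) dimension.
    z_a = (z_a - z_a.mean(dim=0)) / (z_a.std(dim=0) + 1e-6)
    z_b = (z_b - z_b.mean(dim=0)) / (z_b.std(dim=0) + 1e-6)
    # Empirical cross-correlation matrix between the two views, shape (D, D).
    c = (z_a.T @ z_b) / n
    # Invariance term: diagonal entries should equal 1.
    on_diag = (torch.diagonal(c) - 1).pow(2).sum()
    # Redundancy-reduction term: off-diagonal entries should equal 0.
    off_diag = (c - torch.diag(torch.diagonal(c))).pow(2).sum()
    return on_diag + lambd * off_diag

# Usage: embeddings of two augmented views from the same shared-weight network.
z_a, z_b = torch.randn(256, 2048), torch.randn(256, 2048)
loss = barlow_twins_loss(z_a, z_b)
```

This form is consistent with the abstract's claim that no predictor network, gradient stopping, or moving-average weights are needed: a collapsed, constant embedding drives every entry of the normalized cross-correlation matrix toward zero, including the diagonal, and is therefore penalized directly by the invariance term.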